[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
11762 1726853248.99669: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
11762 1726853249.00467: Added group all to inventory
11762 1726853249.00470: Added group ungrouped to inventory
11762 1726853249.00476: Group all now contains ungrouped
11762 1726853249.00480: Examining possible inventory source: /tmp/network-iHm/inventory.yml
11762 1726853249.27185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
11762 1726853249.27311: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
11762 1726853249.27449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
11762 1726853249.27514: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
11762 1726853249.27709: Loaded config def from plugin (inventory/script)
11762 1726853249.27711: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
11762 1726853249.27751: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
11762 1726853249.27957: Loaded config def from plugin
(inventory/yaml) 11762 1726853249.27959: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 11762 1726853249.28156: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 11762 1726853249.29122: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 11762 1726853249.29126: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 11762 1726853249.29129: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 11762 1726853249.29135: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 11762 1726853249.29139: Loading data from /tmp/network-iHm/inventory.yml 11762 1726853249.29329: /tmp/network-iHm/inventory.yml was not parsable by auto 11762 1726853249.29460: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 11762 1726853249.29617: Loading data from /tmp/network-iHm/inventory.yml 11762 1726853249.29706: group all already in inventory 11762 1726853249.29713: set inventory_file for managed_node1 11762 1726853249.29833: set inventory_dir for managed_node1 11762 1726853249.29835: Added host managed_node1 to inventory 11762 1726853249.29837: Added host managed_node1 to group all 11762 1726853249.29838: set ansible_host for managed_node1 11762 1726853249.29839: set ansible_ssh_extra_args for managed_node1 11762 1726853249.29843: set inventory_file for managed_node2 11762 1726853249.29846: set inventory_dir for managed_node2 11762 1726853249.29847: Added host managed_node2 to inventory 11762 1726853249.29849: Added host managed_node2 to group all 11762 1726853249.29850: set ansible_host for managed_node2 11762 1726853249.29850: set ansible_ssh_extra_args for managed_node2 11762 
1726853249.29853: set inventory_file for managed_node3 11762 1726853249.29856: set inventory_dir for managed_node3 11762 1726853249.29857: Added host managed_node3 to inventory 11762 1726853249.29858: Added host managed_node3 to group all 11762 1726853249.29859: set ansible_host for managed_node3 11762 1726853249.29859: set ansible_ssh_extra_args for managed_node3 11762 1726853249.29862: Reconcile groups and hosts in inventory. 11762 1726853249.29866: Group ungrouped now contains managed_node1 11762 1726853249.29874: Group ungrouped now contains managed_node2 11762 1726853249.29876: Group ungrouped now contains managed_node3 11762 1726853249.30073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11762 1726853249.30322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11762 1726853249.30487: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11762 1726853249.30515: Loaded config def from plugin (vars/host_group_vars) 11762 1726853249.30517: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11762 1726853249.30529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11762 1726853249.30537: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11762 1726853249.30645: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11762 1726853249.31329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853249.31540: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11762 1726853249.31695: Loaded config def from plugin (connection/local) 11762 1726853249.31698: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11762 1726853249.33156: Loaded config def from plugin (connection/paramiko_ssh) 11762 1726853249.33160: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11762 1726853249.35038: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11762 1726853249.35213: Loaded config def from plugin (connection/psrp) 11762 1726853249.35216: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11762 1726853249.36806: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11762 1726853249.36910: Loaded config def from plugin (connection/ssh) 11762 1726853249.36914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11762 1726853249.43047: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11762 1726853249.43202: Loaded config def from plugin (connection/winrm) 11762 1726853249.43206: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11762 1726853249.43238: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11762 1726853249.43623: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11762 1726853249.43775: Loaded config def from plugin (shell/cmd) 11762 1726853249.43778: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11762 1726853249.43813: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11762 1726853249.44033: Loaded config def from plugin (shell/powershell) 11762 1726853249.44035: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11762 1726853249.44092: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11762 1726853249.44515: Loaded config def from plugin (shell/sh) 11762 1726853249.44517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11762 1726853249.44550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11762 1726853249.44910: Loaded config def from plugin (become/runas) 11762 1726853249.44913: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11762 1726853249.45297: Loaded config def from plugin (become/su) 11762 1726853249.45299: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11762 1726853249.45691: Loaded config def from plugin (become/sudo) 11762 
1726853249.45693: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11762 1726853249.45725: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml 11762 1726853249.46555: in VariableManager get_vars() 11762 1726853249.46581: done with get_vars() 11762 1726853249.46836: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11762 1726853249.52234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11762 1726853249.52362: in VariableManager get_vars() 11762 1726853249.52366: done with get_vars() 11762 1726853249.52369: variable 'playbook_dir' from source: magic vars 11762 1726853249.52372: variable 'ansible_playbook_python' from source: magic vars 11762 1726853249.52373: variable 'ansible_config_file' from source: magic vars 11762 1726853249.52374: variable 'groups' from source: magic vars 11762 1726853249.52375: variable 'omit' from source: magic vars 11762 1726853249.52375: variable 'ansible_version' from source: magic vars 11762 1726853249.52376: variable 'ansible_check_mode' from source: magic vars 11762 1726853249.52377: variable 'ansible_diff_mode' from source: magic vars 11762 1726853249.52378: variable 'ansible_forks' from source: magic vars 11762 1726853249.52378: variable 'ansible_inventory_sources' from source: magic vars 11762 1726853249.52379: variable 'ansible_skip_tags' from source: magic vars 11762 1726853249.52380: variable 'ansible_limit' from source: magic vars 11762 1726853249.52380: variable 'ansible_run_tags' from source: magic vars 11762 1726853249.52381: variable 'ansible_verbosity' from source: magic vars 11762 1726853249.52416: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml 11762 1726853249.53240: in 
VariableManager get_vars() 11762 1726853249.53257: done with get_vars() 11762 1726853249.53506: in VariableManager get_vars() 11762 1726853249.53520: done with get_vars() 11762 1726853249.53573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11762 1726853249.53587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11762 1726853249.54154: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11762 1726853249.54474: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11762 1726853249.54477: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 11762 1726853249.54515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11762 1726853249.54566: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11762 1726853249.54781: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11762 1726853249.54854: Loaded config def from plugin (callback/default) 11762 1726853249.54856: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11762 1726853249.56670: Loaded config def from plugin (callback/junit) 11762 1726853249.56675: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11762 1726853249.56725: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11762 1726853249.56784: Loaded config def from plugin (callback/minimal) 11762 1726853249.56787: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11762 1726853249.56831: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11762 1726853249.56900: Loaded config def from plugin (callback/tree) 11762 1726853249.56903: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) 
ansible.builtin.profile_tasks to ansible.posix.profile_tasks
11762 1726853249.57031: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
11762 1726853249.57034: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_options_nm.yml ********************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml
11762 1726853249.57074: in VariableManager get_vars()
11762 1726853249.57089: done with get_vars()
11762 1726853249.57095: in VariableManager get_vars()
11762 1726853249.57103: done with get_vars()
11762 1726853249.57107: variable 'omit' from source: magic vars
11762 1726853249.57144: in VariableManager get_vars()
11762 1726853249.57166: done with get_vars()
11762 1726853249.57189: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_options.yml' with nm as provider] *****
11762 1726853249.57890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11762 1726853249.60085: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11762 1726853249.60125: getting the remaining hosts for this loop
11762 1726853249.60127: done getting the remaining hosts for this loop
11762 1726853249.60130: getting the next task for host managed_node2
11762 1726853249.60134: done getting next task for host managed_node2
11762 1726853249.60135: ^ task is: TASK: Gathering Facts
11762 1726853249.60137: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853249.60139: getting variables
11762 1726853249.60140: in VariableManager get_vars()
11762 1726853249.60150: Calling all_inventory to load vars for managed_node2
11762 1726853249.60153: Calling groups_inventory to load vars for managed_node2
11762 1726853249.60155: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853249.60167: Calling all_plugins_play to load vars for managed_node2
11762 1726853249.60182: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853249.60186: Calling groups_plugins_play to load vars for managed_node2
11762 1726853249.60228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853249.60280: done with get_vars()
11762 1726853249.60287: done getting variables
11762 1726853249.60360: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
Friday 20 September 2024 13:27:29 -0400 (0:00:00.034) 0:00:00.034 ******
11762 1726853249.60384: entering _queue_task() for managed_node2/gather_facts
11762 1726853249.60385: Creating lock for gather_facts
11762 1726853249.60799: worker is 1 (out of 1 available)
11762 1726853249.60812: exiting _queue_task() for managed_node2/gather_facts
11762 1726853249.60827: done queuing things up, now waiting for
results queue to drain 11762 1726853249.60830: waiting for pending results... 11762 1726853249.61033: running TaskExecutor() for managed_node2/TASK: Gathering Facts 11762 1726853249.61123: in run() - task 02083763-bbaf-d845-03d0-000000000015 11762 1726853249.61149: variable 'ansible_search_path' from source: unknown 11762 1726853249.61200: calling self._execute() 11762 1726853249.61296: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853249.61322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853249.61334: variable 'omit' from source: magic vars 11762 1726853249.61439: variable 'omit' from source: magic vars 11762 1726853249.61476: variable 'omit' from source: magic vars 11762 1726853249.61622: variable 'omit' from source: magic vars 11762 1726853249.61625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853249.61627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853249.61649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853249.61675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853249.61691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853249.61728: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853249.61741: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853249.61752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853249.61863: Set connection var ansible_timeout to 10 11762 1726853249.61873: Set connection var ansible_shell_type to sh 11762 1726853249.61951: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11762 1726853249.61955: Set connection var ansible_shell_executable to /bin/sh 11762 1726853249.61958: Set connection var ansible_pipelining to False 11762 1726853249.61960: Set connection var ansible_connection to ssh 11762 1726853249.61962: variable 'ansible_shell_executable' from source: unknown 11762 1726853249.61964: variable 'ansible_connection' from source: unknown 11762 1726853249.61966: variable 'ansible_module_compression' from source: unknown 11762 1726853249.61967: variable 'ansible_shell_type' from source: unknown 11762 1726853249.61969: variable 'ansible_shell_executable' from source: unknown 11762 1726853249.62058: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853249.62061: variable 'ansible_pipelining' from source: unknown 11762 1726853249.62063: variable 'ansible_timeout' from source: unknown 11762 1726853249.62065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853249.62235: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 11762 1726853249.62253: variable 'omit' from source: magic vars 11762 1726853249.62261: starting attempt loop 11762 1726853249.62275: running the handler 11762 1726853249.62297: variable 'ansible_facts' from source: unknown 11762 1726853249.62323: _low_level_execute_command(): starting 11762 1726853249.62336: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853249.63154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853249.63235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853249.63273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853249.63382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853249.65154: stdout chunk (state=3): >>>/root <<< 11762 1726853249.65418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853249.65432: stderr chunk (state=3): >>><<< 11762 1726853249.65441: stdout chunk (state=3): >>><<< 11762 1726853249.65629: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853249.65634: _low_level_execute_command(): starting 11762 1726853249.65637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571 `" && echo ansible-tmp-1726853249.6549523-11797-224052497572571="` echo /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571 `" ) && sleep 0' 11762 1726853249.66526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853249.66540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853249.66555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853249.66589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853249.66697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853249.66731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853249.66881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853249.68951: stdout chunk (state=3): >>>ansible-tmp-1726853249.6549523-11797-224052497572571=/root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571 <<< 11762 1726853249.69179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853249.69216: stdout chunk (state=3): >>><<< 11762 1726853249.69303: stderr chunk (state=3): >>><<< 11762 1726853249.69382: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853249.6549523-11797-224052497572571=/root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853249.69516: variable 'ansible_module_compression' from source: unknown 11762 1726853249.69660: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11762 1726853249.69675: ANSIBALLZ: Acquiring lock 11762 1726853249.69684: ANSIBALLZ: Lock acquired: 139956166284816 11762 1726853249.69748: ANSIBALLZ: Creating module 11762 1726853250.18287: ANSIBALLZ: Writing module into payload 11762 1726853250.18678: ANSIBALLZ: Writing module 11762 1726853250.18681: ANSIBALLZ: Renaming module 11762 1726853250.18684: ANSIBALLZ: Done creating module 11762 1726853250.18980: variable 'ansible_facts' from source: unknown 11762 1726853250.18984: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853250.18986: _low_level_execute_command(): starting 11762 1726853250.18989: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11762 1726853250.20177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853250.20192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853250.20206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853250.20225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853250.20243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853250.20259: stderr chunk (state=3): >>>debug2: match not found <<< 11762 
1726853250.20355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853250.20585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853250.20887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853250.22939: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11762 1726853250.23020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853250.23025: stdout chunk (state=3): >>><<< 11762 1726853250.23027: stderr chunk (state=3): >>><<< 11762 1726853250.23168: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853250.23177 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11762 1726853250.23183: _low_level_execute_command(): starting 11762 1726853250.23187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11762 1726853250.23517: Sending initial data 11762 1726853250.23520: Sent initial data (1181 bytes) 11762 1726853250.24428: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11762 1726853250.24806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853250.28265: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11762 1726853250.28811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853250.29278: stderr chunk (state=3): >>><<< 11762 1726853250.29282: stdout chunk (state=3): >>><<< 11762 1726853250.29286: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853250.29289: variable 'ansible_facts' from source: unknown 11762 1726853250.29291: variable 'ansible_facts' from source: unknown 11762 1726853250.29294: variable 'ansible_module_compression' from source: unknown 11762 1726853250.29297: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11762 1726853250.29977: variable 'ansible_facts' from source: unknown 11762 1726853250.29982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/AnsiballZ_setup.py 11762 1726853250.30701: Sending initial data 11762 1726853250.30711: Sent initial data (154 bytes) 11762 1726853250.31688: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853250.31888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853250.32181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853250.33880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853250.33966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853250.34151: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmppi0tya9o /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/AnsiballZ_setup.py <<< 11762 1726853250.34162: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/AnsiballZ_setup.py" <<< 11762 1726853250.34217: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmppi0tya9o" to remote "/root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/AnsiballZ_setup.py" <<< 11762 1726853250.38253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853250.38331: stderr chunk (state=3): >>><<< 11762 1726853250.38340: stdout chunk (state=3): >>><<< 11762 1726853250.38366: done transferring module to remote 11762 1726853250.38389: _low_level_execute_command(): starting 11762 1726853250.38400: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/ /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/AnsiballZ_setup.py && sleep 0' 11762 1726853250.40189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853250.40209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853250.40224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853250.40484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853250.42715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853250.42719: stdout chunk (state=3): >>><<< 11762 1726853250.42722: stderr chunk (state=3): >>><<< 11762 1726853250.42738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853250.42747: _low_level_execute_command(): starting 11762 1726853250.42757: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/AnsiballZ_setup.py && sleep 0' 11762 1726853250.43974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853250.44008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853250.44025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853250.44163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853250.44290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853250.44311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853250.44441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 11762 1726853250.46803: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11762 1726853250.47005: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 11762 1726853250.47039: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 11762 1726853250.47073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 11762 1726853250.47089: stdout chunk (state=3): >>>import 'codecs' # <<< 11762 1726853250.47132: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11762 1726853250.47158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adf684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adf37b30> <<< 11762 1726853250.47195: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adf6aa50> <<< 11762 1726853250.47243: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 11762 1726853250.47306: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 11762 1726853250.47394: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11762 
1726853250.47433: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11762 1726853250.47553: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 11762 1726853250.47617: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11762 1726853250.47624: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 11762 1726853250.47626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add1d130> <<< 11762 1726853250.47695: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add1dfa0> import 'site' # <<< 11762 1726853250.47699: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11762 1726853250.48094: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11762 1726853250.48213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11762 1726853250.48232: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11762 1726853250.48257: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add5be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11762 1726853250.48289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11762 1726853250.48474: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add5bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11762 1726853250.48479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11762 1726853250.48504: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add937d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add93e60> import '_collections' # <<< 11762 1726853250.48567: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add73ad0> <<< 11762 1726853250.48599: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add711f0> <<< 11762 1726853250.48697: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add58fb0> <<< 11762 1726853250.48907: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addb3770> <<< 11762 1726853250.48911: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addb2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add72090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addb0bc0> <<< 11762 1726853250.48985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 11762 1726853250.48994: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add58230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11762 1726853250.49022: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.49323: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adde8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adde8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add56d50> <<< 11762 1726853250.49329: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addea480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11762 1726853250.49358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade006b0> import 'errno' # <<< 11762 1726853250.49437: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ade01d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from 
'/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11762 1726853250.49479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade02c30> <<< 11762 1726853250.49837: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ade03290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade02180> <<< 11762 1726853250.49841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11762 1726853250.49867: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ade03d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade03440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addea4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # 
extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb0bbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb346b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb34410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb346e0> <<< 11762 1726853250.49893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11762 1726853250.49950: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.50083: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb35010> <<< 11762 1726853250.50248: stdout chunk (state=3): >>># extension module '_blake2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb35a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb348c0> <<< 11762 1726853250.50287: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb09d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11762 1726853250.50374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11762 1726853250.50489: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb36e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb35b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addeabd0> <<< 11762 1726853250.50492: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11762 1726853250.50519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11762 1726853250.50551: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb63140> <<< 11762 1726853250.50623: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11762 1726853250.50652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11762 1726853250.50666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11762 1726853250.50724: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb83500> <<< 11762 1726853250.50728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11762 1726853250.50768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11762 1726853250.50848: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 11762 1726853250.50869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbe4260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11762 1726853250.50909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11762 1726853250.50958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11762 1726853250.50969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11762 1726853250.51051: stdout chunk (state=3): >>>import 
'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbe69c0> <<< 11762 1726853250.51139: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbe4380> <<< 11762 1726853250.51232: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbad280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad525370> <<< 11762 1726853250.51352: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb82300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb37d40> <<< 11762 1726853250.51416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11762 1726853250.51494: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd0adb82420> <<< 11762 1726853250.51996: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_sp9zm2os/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 11762 1726853250.52040: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.52082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11762 1726853250.52093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11762 1726853250.52269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11762 1726853250.52276: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad58aff0> import '_typing' # <<< 11762 1726853250.52451: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad569ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad569040> # zipimport: zlib available <<< 11762 1726853250.52524: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 11762 1726853250.52600: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11762 1726853250.54092: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.55220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 11762 1726853250.55260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad588ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11762 1726853250.55420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad5be900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5be690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5bdfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11762 1726853250.55551: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5be3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad58bc80> import 'atexit' # <<< 11762 1726853250.55555: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad5bf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad5bf8f0> <<< 11762 1726853250.55558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11762 1726853250.55614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' 
import '_locale' # <<< 11762 1726853250.55708: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5bfe30> import 'pwd' # <<< 11762 1726853250.55712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11762 1726853250.55791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11762 1726853250.55818: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad425bb0> <<< 11762 1726853250.55823: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad4277d0> <<< 11762 1726853250.55940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11762 1726853250.55943: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad428170> <<< 11762 1726853250.55981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11762 1726853250.55987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11762 1726853250.56068: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad429310> <<< 11762 1726853250.56076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11762 1726853250.56184: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad42bda0> <<< 11762 1726853250.56187: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad56b0e0> <<< 11762 1726853250.56189: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad42a060> <<< 11762 1726853250.56211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11762 1726853250.56448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad433d40> import '_tokenize' # <<< 11762 1726853250.56523: stdout chunk (state=3): >>>import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad432810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad432570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11762 1726853250.56553: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad432ae0> <<< 11762 1726853250.56623: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad42a570> <<< 11762 1726853250.56689: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad477f80> <<< 11762 1726853250.56733: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4780e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11762 1726853250.56863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad479bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad479970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11762 1726853250.56878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad47c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad47a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11762 1726853250.56930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853250.56967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11762 1726853250.57105: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad47f860> <<< 11762 1726853250.57128: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad47c230> <<< 11762 1726853250.57214: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad480b90> <<< 11762 1726853250.57234: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad480710> <<< 11762 1726853250.57274: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad480230> <<< 11762 1726853250.57412: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4782f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 11762 1726853250.57441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad30c110> <<< 11762 1726853250.57565: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.57594: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad30d040> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4828a0> <<< 11762 1726853250.57904: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad483c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad482540> # zipimport: zlib available <<< 11762 1726853250.57908: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11762 1726853250.57931: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11762 1726853250.58027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.58148: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.58713: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.59289: stdout chunk (state=3): 
>>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11762 1726853250.59307: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11762 1726853250.59329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853250.59396: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad311220> <<< 11762 1726853250.59466: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 11762 1726853250.59488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11762 1726853250.59495: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad311fd0> <<< 11762 1726853250.59501: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad30d1f0> <<< 11762 1726853250.59550: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11762 1726853250.59576: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.59596: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 11762 1726853250.59605: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.59754: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.59918: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 11762 1726853250.59925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11762 1726853250.59928: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad312210> <<< 11762 1726853250.59948: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.60421: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.60873: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.60940: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.61020: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11762 1726853250.61029: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.61077: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.61117: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11762 1726853250.61201: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.61291: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 11762 1726853250.61325: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11762 1726853250.61393: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.61492: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11762 1726853250.61701: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.61878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11762 1726853250.61947: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11762 1726853250.61975: stdout chunk (state=3): >>>import '_ast' # <<< 11762 1726853250.62030: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3132c0> <<< 11762 1726853250.62076: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.62152: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.62279: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 11762 1726853250.62296: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 11762 1726853250.62313: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.62429: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 11762 1726853250.62463: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.62587: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.62648: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853250.62747: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad31de80> <<< 11762 1726853250.62770: stdout chunk (state=3): >>>import 
'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad31bf50> <<< 11762 1726853250.62918: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11762 1726853250.62994: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63054: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853250.63080: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11762 1726853250.63203: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11762 1726853250.63235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11762 1726853250.63308: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4066f0> <<< 11762 1726853250.63366: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4fe3c0> <<< 11762 1726853250.63439: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad31dbb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad313ad0> # destroy 
ansible.module_utils.distro <<< 11762 1726853250.63453: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 11762 1726853250.63476: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63512: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 11762 1726853250.63522: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 11762 1726853250.63574: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11762 1726853250.63597: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63612: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11762 1726853250.63686: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63746: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63781: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63797: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63832: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63903: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63985: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.63990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11762 1726853250.63993: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.64056: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.64132: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.64149: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.64195: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11762 1726853250.64200: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.64389: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853250.64559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.64608: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.64665: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853250.64690: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11762 1726853250.64711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11762 1726853250.64728: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11762 1726853250.64761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11762 1726853250.64782: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b1a90> <<< 11762 1726853250.64807: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 11762 1726853250.64817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11762 1726853250.64841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11762 1726853250.64887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11762 1726853250.64911: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11762 1726853250.64922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 11762 1726853250.64940: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf33ce0> <<< 11762 1726853250.64969: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.64995: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acf33fe0> <<< 11762 1726853250.65043: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad39a6c0> <<< 11762 1726853250.65067: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b2600> <<< 11762 1726853250.65093: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b0170> <<< 11762 1726853250.65111: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b0b60> <<< 11762 1726853250.65129: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11762 1726853250.65209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11762 1726853250.65224: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11762 1726853250.65240: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11762 1726853250.65260: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 11762 1726853250.65273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11762 1726853250.65306: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acf4ef60> <<< 11762 1726853250.65315: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4e810> <<< 11762 1726853250.65338: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acf4e9f0> <<< 11762 1726853250.65363: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4dc70> <<< 11762 1726853250.65380: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11762 1726853250.65523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11762 1726853250.65530: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4f110> <<< 11762 1726853250.65554: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11762 1726853250.65589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11762 1726853250.65624: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acfa5c10> <<< 11762 1726853250.65653: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4fbf0> <<< 11762 1726853250.65702: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b3d10> <<< 11762 1726853250.65776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 11762 1726853250.65780: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 11762 1726853250.65889: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.65905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11762 1726853250.65908: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.65989: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.66034: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 11762 1726853250.66092: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # 
zipimport: zlib available <<< 11762 1726853250.66119: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.66234: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 11762 1726853250.66237: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11762 1726853250.66360: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # <<< 11762 1726853250.66363: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.66393: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.66481: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.66586: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.66589: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 11762 1726853250.66810: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67114: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11762 1726853250.67539: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67587: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67647: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67678: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67721: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 11762 1726853250.67724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 11762 1726853250.67763: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11762 1726853250.67800: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 11762 1726853250.67809: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67877: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 11762 1726853250.67974: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.67977: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 11762 1726853250.68047: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11762 1726853250.68103: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68169: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68261: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11762 1726853250.68292: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acfa7230> <<< 11762 1726853250.68314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11762 1726853250.68354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11762 1726853250.68492: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acfa67e0> import 'ansible.module_utils.facts.system.local' # <<< 11762 1726853250.68495: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68567: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853250.68627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11762 1726853250.68633: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68722: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11762 1726853250.68820: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68893: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.68961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11762 1726853250.68980: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.69021: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.69066: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11762 1726853250.69119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11762 1726853250.69201: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.69265: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.69267: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acfd5f40> <<< 11762 1726853250.69459: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad312f90> <<< 11762 1726853250.69462: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 11762 1726853250.69474: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.69534: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853250.69584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11762 1726853250.69603: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.69689: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.69774: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.69885: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70032: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 11762 1726853250.70048: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70091: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11762 1726853250.70138: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11762 1726853250.70241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11762 1726853250.70276: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.70304: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acfe9a60> <<< 11762 1726853250.70318: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acfc6ea0> import 
'ansible.module_utils.facts.system.user' # <<< 11762 1726853250.70328: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70340: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70347: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 11762 1726853250.70360: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70409: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 11762 1726853250.70458: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70617: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70772: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11762 1726853250.70778: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70885: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.70985: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71031: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71080: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 11762 1726853250.71084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 11762 1726853250.71090: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71111: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71139: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71286: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11762 1726853250.71458: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 
1726853250.71576: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11762 1726853250.71720: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71741: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.71777: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.72350: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.72863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 11762 1726853250.72874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 11762 1726853250.72890: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.72994: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11762 1726853250.73110: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73213: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11762 1726853250.73330: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73636: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11762 1726853250.73644: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73666: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11762 1726853250.73684: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73728: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73777: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.base' # <<< 11762 1726853250.73786: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73886: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.73987: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74190: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11762 1726853250.74414: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74447: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11762 1726853250.74497: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74526: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11762 1726853250.74557: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74636: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11762 1726853250.74711: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74741: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74759: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 11762 1726853250.74776: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74834: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.74896: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11762 1726853250.74902: stdout chunk (state=3): >>># zipimport: zlib available <<< 
11762 1726853250.74967: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11762 1726853250.75034: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75307: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 11762 1726853250.75582: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75642: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11762 1726853250.75721: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75747: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 11762 1726853250.75800: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75834: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11762 1726853250.75877: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75905: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.75944: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 11762 1726853250.75952: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76032: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76108: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11762 1726853250.76129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76141: stdout chunk (state=3): >>># zipimport: zlib available <<< 
11762 1726853250.76156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 11762 1726853250.76159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76210: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76245: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11762 1726853250.76268: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76287: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76306: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76351: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76407: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76551: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 11762 1726853250.76563: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11762 1726853250.76570: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76626: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76680: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 11762 1726853250.76689: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.76892: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11762 1726853250.77094: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77143: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11762 1726853250.77201: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853250.77244: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11762 1726853250.77299: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77386: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77467: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 11762 1726853250.77477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 11762 1726853250.77487: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77577: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.77667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11762 1726853250.77673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11762 1726853250.77753: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853250.78352: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11762 1726853250.78388: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11762 1726853250.78398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11762 1726853250.78438: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853250.78445: stdout chunk (state=3): >>># extension module 'unicodedata' executed from 
'/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acd830e0> <<< 11762 1726853250.78452: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acd83f20> <<< 11762 1726853250.78504: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acd793d0> <<< 11762 1726853250.91818: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11762 1726853250.91833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 11762 1726853250.91839: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdcabd0> <<< 11762 1726853250.91864: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 11762 1726853250.91890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 11762 1726853250.91910: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdc9250> <<< 11762 1726853250.91967: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 11762 1726853250.91974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853250.92007: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 11762 1726853250.92014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdcb5c0> <<< 11762 1726853250.92056: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdca1b0> <<< 11762 1726853250.92336: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11762 1726853251.16584: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.65185546875, "5m": 0.38134765625, "15m": 0.1826171875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", 
"ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "30", "epoch": "1726853250", "epoch_int": "1726853250", "date": "2024-09-20", "time": "13:27:30", "iso8601_micro": "2024-09-20T17:27:30.787488Z", "iso8601": "2024-09-20T17:27:30Z", "iso8601_basic": "20240920T132730787488", "iso8601_basic_short": "20240920T132730", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh<<< 11762 1726853251.16630: stdout chunk (state=3): >>>6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": <<< 11762 1726853251.16658: stdout chunk (state=3): >>>null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 461, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794066432, "block_size": 4096, "block_total": 65519099, "block_available": 63914567, "block_used": 1604532, "inode_total": 131070960, "inode_available": 131029089, "inode_used": 41871, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_lsb": {}, "ansible_local": {}, 
"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11762 1726853251.17269: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 11762 1726853251.17297: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix 
# cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 11762 1726853251.17381: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap<<< 11762 1726853251.17386: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # 
cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile <<< 11762 1726853251.17390: stdout chunk (state=3): >>># cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder <<< 11762 1726853251.17409: stdout chunk (state=3): >>># cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] 
removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon <<< 11762 1726853251.17456: stdout chunk (state=3): >>># cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 11762 1726853251.17493: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing 
ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb <<< 
11762 1726853251.17563: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base <<< 11762 1726853251.17582: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11762 1726853251.17981: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11762 1726853251.18034: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 11762 1726853251.18058: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 11762 1726853251.18122: stdout chunk (state=3): >>># destroy ntpath <<< 11762 1726853251.18133: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 11762 
1726853251.18176: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11762 1726853251.18216: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 11762 1726853251.18239: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11762 1726853251.18314: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 11762 1726853251.18380: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl <<< 11762 1726853251.18385: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 11762 1726853251.18458: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 11762 1726853251.18483: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 11762 1726853251.18510: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 11762 1726853251.18586: stdout chunk (state=3): >>># cleanup[3] 
wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 11762 1726853251.18651: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11762 1726853251.18743: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping 
io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 11762 1726853251.18790: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11762 1726853251.18793: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11762 1726853251.18949: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11762 1726853251.19012: stdout chunk (state=3): >>># destroy _collections <<< 11762 1726853251.19016: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11762 1726853251.19043: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11762 1726853251.19077: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11762 1726853251.19111: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11762 1726853251.19132: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11762 
1726853251.19221: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig<<< 11762 1726853251.19262: stdout chunk (state=3): >>> # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 11762 1726853251.19327: stdout chunk (state=3): >>># destroy _hashlib <<< 11762 1726853251.19333: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 11762 1726853251.19362: stdout chunk (state=3): >>># clear sys.audit hooks <<< 11762 1726853251.19882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853251.19885: stdout chunk (state=3): >>><<< 11762 1726853251.19888: stderr chunk (state=3): >>><<< 11762 1726853251.20002: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adf684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adf37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adf6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add1d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add1dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add5be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add5bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add937d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add93e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add73ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add711f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add58fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addb3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addb2390> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add72090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addb0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add58230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adde8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adde8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0add56d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adde9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addea480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade006b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ade01d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade02c30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ade03290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade02180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ade03d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ade03440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addea4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb0bbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb346b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb34410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb346e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb35010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0adb35a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb348c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb09d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb36e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb35b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0addeabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb63140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb83500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbe4260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbe69c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbe4380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adbad280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad525370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb82300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0adb37d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd0adb82420> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_sp9zm2os/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad58aff0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad569ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad569040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad588ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad5be900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5be690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5bdfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5be3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad58bc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad5bf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad5bf8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad5bfe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad425bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad4277d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad428170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad429310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad42bda0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad56b0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad42a060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad433d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad432810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad432570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad432ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad42a570> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad477f80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4780e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad479bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad479970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad47c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad47a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad47f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad47c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad480b90> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad480710> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad480230> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4782f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad30c110> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad30d040> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4828a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad483c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad482540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad311220> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad311fd0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad30d1f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad312210> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3132c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0ad31de80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad31bf50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4066f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad4fe3c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad31dbb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad313ad0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b1a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf33ce0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acf33fe0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad39a6c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b2600> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b0170> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b0b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acf4ef60> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4e810> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acf4e9f0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4dc70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4f110> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acfa5c10> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acf4fbf0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad3b3d10> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acfa7230> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acfa67e0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acfd5f40> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0ad312f90> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acfe9a60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acfc6ea0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0acd830e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acd83f20> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acd793d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdcabd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdc9250> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdcb5c0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0acdca1b0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.65185546875, "5m": 0.38134765625, "15m": 0.1826171875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "30", "epoch": "1726853250", "epoch_int": "1726853250", "date": "2024-09-20", "time": "13:27:30", "iso8601_micro": "2024-09-20T17:27:30.787488Z", "iso8601": "2024-09-20T17:27:30Z", "iso8601_basic": "20240920T132730787488", "iso8601_basic_short": "20240920T132730", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 
0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 461, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794066432, "block_size": 4096, 
"block_total": 65519099, "block_available": 63914567, "block_used": 1604532, "inode_total": 131070960, "inode_available": 131029089, "inode_used": 41871, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_lsb": {}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] 
removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] 
removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing 
linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection 
to 10.31.9.197 closed. [WARNING]: Module invocation had junk after the JSON data: [interpreter shutdown trace repeated verbatim; identical to the cleanup output above]
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
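The interpreter-discovery warning above can be avoided by pinning the interpreter explicitly instead of relying on discovery. A minimal inventory sketch, assuming the host and interpreter path shown in the log (the actual inventory file at /tmp/network-iHm/inventory.yml is not reproduced here):

```yaml
# Hypothetical inventory snippet: pinning ansible_python_interpreter so a
# later Python installation cannot change which interpreter Ansible uses.
all:
  hosts:
    managed_node2:
      ansible_host: 10.31.9.197                    # address seen in the SSH debug output above
      ansible_python_interpreter: /usr/bin/python3.12  # interpreter named in the warning
```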
11762 1726853251.21417: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853251.21420: _low_level_execute_command(): starting 11762 1726853251.21424: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853249.6549523-11797-224052497572571/ > /dev/null 2>&1 && sleep 0' 11762 1726853251.21728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853251.21744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853251.21785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853251.21799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853251.21877: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853251.21890: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.21906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.22016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.23984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.24002: stdout chunk (state=3): >>><<< 11762 1726853251.24021: stderr chunk (state=3): >>><<< 11762 1726853251.24039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 11762 1726853251.24054: handler run complete 11762 1726853251.24196: variable 'ansible_facts' from source: unknown 11762 1726853251.24308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.24711: variable 'ansible_facts' from source: unknown 11762 1726853251.24779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.24937: attempt loop complete, returning result 11762 1726853251.24949: _execute() done 11762 1726853251.24974: dumping result to json 11762 1726853251.24991: done dumping result, returning 11762 1726853251.25003: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [02083763-bbaf-d845-03d0-000000000015] 11762 1726853251.25011: sending task result for task 02083763-bbaf-d845-03d0-000000000015 ok: [managed_node2] 11762 1726853251.25880: no more pending results, returning what we have 11762 1726853251.26000: results queue empty 11762 1726853251.26002: checking for any_errors_fatal 11762 1726853251.26003: done checking for any_errors_fatal 11762 1726853251.26004: checking for max_fail_percentage 11762 1726853251.26006: done checking for max_fail_percentage 11762 1726853251.26006: checking to see if all hosts have failed and the running result is not ok 11762 1726853251.26007: done checking to see if all hosts have failed 11762 1726853251.26008: getting the remaining hosts for this loop 11762 1726853251.26010: done getting the remaining hosts for this loop 11762 1726853251.26013: getting the next task for host managed_node2 11762 1726853251.26019: done getting next task for host managed_node2 11762 1726853251.26020: ^ task is: TASK: meta (flush_handlers) 11762 1726853251.26022: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853251.26026: getting variables 11762 1726853251.26027: in VariableManager get_vars() 11762 1726853251.26053: Calling all_inventory to load vars for managed_node2 11762 1726853251.26056: Calling groups_inventory to load vars for managed_node2 11762 1726853251.26059: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853251.26123: done sending task result for task 02083763-bbaf-d845-03d0-000000000015 11762 1726853251.26127: WORKER PROCESS EXITING 11762 1726853251.26136: Calling all_plugins_play to load vars for managed_node2 11762 1726853251.26139: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853251.26142: Calling groups_plugins_play to load vars for managed_node2 11762 1726853251.26337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.26551: done with get_vars() 11762 1726853251.26562: done getting variables 11762 1726853251.26624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 11762 1726853251.26696: in VariableManager get_vars() 11762 1726853251.26705: Calling all_inventory to load vars for managed_node2 11762 1726853251.26707: Calling groups_inventory to load vars for managed_node2 11762 1726853251.26710: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853251.26714: Calling all_plugins_play to load vars for managed_node2 11762 1726853251.26717: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853251.26720: Calling groups_plugins_play to load vars for managed_node2 11762 1726853251.26875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.27061: done with get_vars() 11762 1726853251.27078: done queuing things up, now waiting for results queue to drain 11762 
1726853251.27090: results queue empty 11762 1726853251.27091: checking for any_errors_fatal 11762 1726853251.27094: done checking for any_errors_fatal 11762 1726853251.27099: checking for max_fail_percentage 11762 1726853251.27101: done checking for max_fail_percentage 11762 1726853251.27101: checking to see if all hosts have failed and the running result is not ok 11762 1726853251.27102: done checking to see if all hosts have failed 11762 1726853251.27103: getting the remaining hosts for this loop 11762 1726853251.27104: done getting the remaining hosts for this loop 11762 1726853251.27106: getting the next task for host managed_node2 11762 1726853251.27111: done getting next task for host managed_node2 11762 1726853251.27114: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11762 1726853251.27115: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853251.27117: getting variables 11762 1726853251.27118: in VariableManager get_vars() 11762 1726853251.27126: Calling all_inventory to load vars for managed_node2 11762 1726853251.27128: Calling groups_inventory to load vars for managed_node2 11762 1726853251.27130: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853251.27135: Calling all_plugins_play to load vars for managed_node2 11762 1726853251.27137: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853251.27140: Calling groups_plugins_play to load vars for managed_node2 11762 1726853251.27289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.27508: done with get_vars() 11762 1726853251.27524: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:11 Friday 20 September 2024 13:27:31 -0400 (0:00:01.672) 0:00:01.706 ****** 11762 1726853251.27603: entering _queue_task() for managed_node2/include_tasks 11762 1726853251.27604: Creating lock for include_tasks 11762 1726853251.27965: worker is 1 (out of 1 available) 11762 1726853251.27980: exiting _queue_task() for managed_node2/include_tasks 11762 1726853251.27994: done queuing things up, now waiting for results queue to drain 11762 1726853251.27995: waiting for pending results... 
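The TASK banner above points at line 11 of tests_bond_options_nm.yml. The test file itself is not shown in the log, but based on the task name and the included path resolved later, the entry presumably looks something like this sketch (not the verbatim file):

```yaml
# Hypothetical reconstruction of the include at tests_bond_options_nm.yml:11
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```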
11762 1726853251.28133: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 11762 1726853251.28204: in run() - task 02083763-bbaf-d845-03d0-000000000006 11762 1726853251.28217: variable 'ansible_search_path' from source: unknown 11762 1726853251.28245: calling self._execute() 11762 1726853251.28302: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853251.28305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853251.28313: variable 'omit' from source: magic vars 11762 1726853251.28387: _execute() done 11762 1726853251.28392: dumping result to json 11762 1726853251.28396: done dumping result, returning 11762 1726853251.28398: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-d845-03d0-000000000006] 11762 1726853251.28408: sending task result for task 02083763-bbaf-d845-03d0-000000000006 11762 1726853251.28490: done sending task result for task 02083763-bbaf-d845-03d0-000000000006 11762 1726853251.28493: WORKER PROCESS EXITING 11762 1726853251.28530: no more pending results, returning what we have 11762 1726853251.28534: in VariableManager get_vars() 11762 1726853251.28564: Calling all_inventory to load vars for managed_node2 11762 1726853251.28567: Calling groups_inventory to load vars for managed_node2 11762 1726853251.28569: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853251.28579: Calling all_plugins_play to load vars for managed_node2 11762 1726853251.28581: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853251.28583: Calling groups_plugins_play to load vars for managed_node2 11762 1726853251.28698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.28807: done with get_vars() 11762 1726853251.28812: variable 'ansible_search_path' from source: unknown 11762 1726853251.28822: we have 
included files to process 11762 1726853251.28822: generating all_blocks data 11762 1726853251.28823: done generating all_blocks data 11762 1726853251.28824: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11762 1726853251.28825: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11762 1726853251.28826: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11762 1726853251.29258: in VariableManager get_vars() 11762 1726853251.29268: done with get_vars() 11762 1726853251.29277: done processing included file 11762 1726853251.29278: iterating over new_blocks loaded from include file 11762 1726853251.29279: in VariableManager get_vars() 11762 1726853251.29285: done with get_vars() 11762 1726853251.29285: filtering new block on tags 11762 1726853251.29297: done filtering new block on tags 11762 1726853251.29299: in VariableManager get_vars() 11762 1726853251.29305: done with get_vars() 11762 1726853251.29306: filtering new block on tags 11762 1726853251.29315: done filtering new block on tags 11762 1726853251.29316: in VariableManager get_vars() 11762 1726853251.29322: done with get_vars() 11762 1726853251.29323: filtering new block on tags 11762 1726853251.29330: done filtering new block on tags 11762 1726853251.29331: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 11762 1726853251.29335: extending task lists for all hosts with included blocks 11762 1726853251.29363: done extending task lists 11762 1726853251.29363: done processing included files 11762 1726853251.29364: results queue empty 11762 1726853251.29364: checking for any_errors_fatal 11762 1726853251.29365: done checking for any_errors_fatal 11762 
1726853251.29365: checking for max_fail_percentage 11762 1726853251.29366: done checking for max_fail_percentage 11762 1726853251.29366: checking to see if all hosts have failed and the running result is not ok 11762 1726853251.29367: done checking to see if all hosts have failed 11762 1726853251.29367: getting the remaining hosts for this loop 11762 1726853251.29368: done getting the remaining hosts for this loop 11762 1726853251.29369: getting the next task for host managed_node2 11762 1726853251.29373: done getting next task for host managed_node2 11762 1726853251.29375: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11762 1726853251.29376: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853251.29377: getting variables 11762 1726853251.29378: in VariableManager get_vars() 11762 1726853251.29384: Calling all_inventory to load vars for managed_node2 11762 1726853251.29385: Calling groups_inventory to load vars for managed_node2 11762 1726853251.29386: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853251.29390: Calling all_plugins_play to load vars for managed_node2 11762 1726853251.29391: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853251.29393: Calling groups_plugins_play to load vars for managed_node2 11762 1726853251.29485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.29593: done with get_vars() 11762 1726853251.29599: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:27:31 -0400 (0:00:00.020) 0:00:01.726 ****** 11762 1726853251.29645: entering _queue_task() for managed_node2/setup 11762 1726853251.29852: worker is 1 (out of 1 available) 11762 1726853251.29866: exiting _queue_task() for managed_node2/setup 11762 1726853251.30057: done queuing things up, now waiting for results queue to drain 11762 1726853251.30059: waiting for pending results... 
11762 1726853251.30089: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 11762 1726853251.30277: in run() - task 02083763-bbaf-d845-03d0-000000000026 11762 1726853251.30281: variable 'ansible_search_path' from source: unknown 11762 1726853251.30284: variable 'ansible_search_path' from source: unknown 11762 1726853251.30286: calling self._execute() 11762 1726853251.30289: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853251.30291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853251.30293: variable 'omit' from source: magic vars 11762 1726853251.30775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853251.32233: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853251.32280: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853251.32307: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853251.32340: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853251.32364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853251.32423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853251.32443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853251.32463: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853251.32495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853251.32506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853251.32627: variable 'ansible_facts' from source: unknown 11762 1726853251.32670: variable 'network_test_required_facts' from source: task vars 11762 1726853251.32703: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11762 1726853251.32706: variable 'omit' from source: magic vars 11762 1726853251.32732: variable 'omit' from source: magic vars 11762 1726853251.32756: variable 'omit' from source: magic vars 11762 1726853251.32776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853251.32802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853251.32844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853251.32847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853251.32854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853251.32877: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853251.32880: variable 'ansible_host' from source: host vars for 
'managed_node2' 11762 1726853251.32883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853251.33122: Set connection var ansible_timeout to 10 11762 1726853251.33126: Set connection var ansible_shell_type to sh 11762 1726853251.33128: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853251.33131: Set connection var ansible_shell_executable to /bin/sh 11762 1726853251.33134: Set connection var ansible_pipelining to False 11762 1726853251.33135: Set connection var ansible_connection to ssh 11762 1726853251.33137: variable 'ansible_shell_executable' from source: unknown 11762 1726853251.33140: variable 'ansible_connection' from source: unknown 11762 1726853251.33141: variable 'ansible_module_compression' from source: unknown 11762 1726853251.33143: variable 'ansible_shell_type' from source: unknown 11762 1726853251.33145: variable 'ansible_shell_executable' from source: unknown 11762 1726853251.33147: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853251.33149: variable 'ansible_pipelining' from source: unknown 11762 1726853251.33151: variable 'ansible_timeout' from source: unknown 11762 1726853251.33153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853251.33290: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853251.33306: variable 'omit' from source: magic vars 11762 1726853251.33315: starting attempt loop 11762 1726853251.33322: running the handler 11762 1726853251.33348: _low_level_execute_command(): starting 11762 1726853251.33360: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853251.34127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 
1726853251.34253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.34305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.34384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.36129: stdout chunk (state=3): >>>/root <<< 11762 1726853251.36287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.36290: stdout chunk (state=3): >>><<< 11762 1726853251.36293: stderr chunk (state=3): >>><<< 11762 1726853251.36311: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853251.36421: _low_level_execute_command(): starting 11762 1726853251.36426: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637 `" && echo ansible-tmp-1726853251.3632505-11855-163286145953637="` echo /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637 `" ) && sleep 0' 11762 1726853251.37036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853251.37053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853251.37067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853251.37087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853251.37103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853251.37126: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853251.37187: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853251.37247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853251.37275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.37290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.37395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.39429: stdout chunk (state=3): >>>ansible-tmp-1726853251.3632505-11855-163286145953637=/root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637 <<< 11762 1726853251.39585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.39677: stdout chunk (state=3): >>><<< 11762 1726853251.39681: stderr chunk (state=3): >>><<< 11762 1726853251.39684: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853251.3632505-11855-163286145953637=/root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853251.39686: variable 'ansible_module_compression' from source: unknown 11762 1726853251.39743: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11762 1726853251.39823: variable 'ansible_facts' from source: unknown 11762 1726853251.40066: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/AnsiballZ_setup.py 11762 1726853251.40275: Sending initial data 11762 1726853251.40278: Sent initial data (154 bytes) 11762 1726853251.40907: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.40933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.41034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.42727: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853251.42808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853251.42903: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpd1o3kfpa /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/AnsiballZ_setup.py <<< 11762 1726853251.42907: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/AnsiballZ_setup.py" <<< 11762 1726853251.42969: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpd1o3kfpa" to remote "/root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/AnsiballZ_setup.py" <<< 11762 1726853251.44691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.44758: stderr chunk (state=3): >>><<< 11762 1726853251.44877: stdout chunk (state=3): >>><<< 11762 1726853251.44880: done transferring module to remote 11762 1726853251.44882: _low_level_execute_command(): starting 11762 1726853251.44885: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/ /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/AnsiballZ_setup.py && sleep 0' 11762 1726853251.45520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853251.45533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853251.45568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853251.45676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.45705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.45819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.47786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.47790: stdout chunk (state=3): >>><<< 11762 1726853251.47792: stderr chunk (state=3): >>><<< 11762 1726853251.47818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853251.47877: _low_level_execute_command(): starting 11762 1726853251.47880: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/AnsiballZ_setup.py && sleep 0' 11762 1726853251.48495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853251.48510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853251.48539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853251.48560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853251.48579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853251.48649: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853251.48700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 
1726853251.48715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.48762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.48884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.51192: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11762 1726853251.51212: stdout chunk (state=3): >>>import _imp # builtin <<< 11762 1726853251.51247: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11762 1726853251.51314: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11762 1726853251.51351: stdout chunk (state=3): >>>import 'posix' # <<< 11762 1726853251.51390: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 11762 1726853251.51399: stdout chunk (state=3): >>># installing zipimport hook <<< 11762 1726853251.51413: stdout chunk (state=3): >>>import 'time' # <<< 11762 1726853251.51426: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11762 1726853251.51480: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 11762 1726853251.51488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.51500: stdout chunk (state=3): >>>import '_codecs' # <<< 11762 1726853251.51520: stdout chunk (state=3): >>>import 'codecs' # <<< 11762 1726853251.51557: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11762 1726853251.51594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11762 1726853251.51597: stdout chunk (state=3): >>>import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4b684d0> <<< 11762 1726853251.51599: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4b37b30> <<< 11762 1726853251.51627: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 11762 1726853251.51636: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4b6aa50><<< 11762 1726853251.51649: stdout chunk (state=3): >>> <<< 11762 1726853251.51655: stdout chunk (state=3): >>>import '_signal' # <<< 11762 1726853251.51683: stdout chunk (state=3): >>>import '_abc' # <<< 11762 1726853251.51704: stdout chunk (state=3): >>>import 'abc' # <<< 11762 1726853251.51711: stdout chunk (state=3): >>>import 'io' # <<< 11762 1726853251.51740: stdout chunk (state=3): >>>import '_stat' # <<< 11762 1726853251.51748: stdout chunk (state=3): >>>import 'stat' # <<< 11762 1726853251.51834: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11762 1726853251.51854: stdout chunk (state=3): >>>import 'genericpath' # <<< 11762 1726853251.51866: stdout chunk (state=3): >>>import 'posixpath' # <<< 11762 1726853251.51885: stdout chunk (state=3): >>>import 'os' # <<< 11762 1726853251.51908: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11762 1726853251.51926: stdout chunk (state=3): >>>Processing user site-packages <<< 11762 1726853251.51940: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 11762 1726853251.51951: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11762 1726853251.51973: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 11762 1726853251.51986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11762 1726853251.52002: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd493d130> <<< 11762 1726853251.52061: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11762 1726853251.52068: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.52082: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd493dfa0> <<< 11762 1726853251.52100: stdout chunk (state=3): >>>import 'site' # <<< 11762 1726853251.52140: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11762 1726853251.52511: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11762 1726853251.52532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11762 1726853251.52562: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11762 1726853251.52568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.52595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11762 1726853251.52638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11762 1726853251.52641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11762 1726853251.52682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11762 1726853251.52686: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd497bec0> <<< 11762 1726853251.52713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11762 1726853251.52728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11762 1726853251.52753: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd497bf80> <<< 11762 1726853251.52769: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11762 1726853251.52799: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11762 1726853251.52826: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11762 1726853251.52866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.52883: stdout chunk (state=3): >>>import 'itertools' # <<< 11762 1726853251.52905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 11762 1726853251.52939: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49b3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 11762 1726853251.52957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49b3ec0> import '_collections' # <<< 11762 1726853251.53022: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4993b60> <<< 11762 1726853251.53039: stdout chunk (state=3): >>>import '_functools' # <<< 11762 1726853251.53063: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49912b0> <<< 11762 1726853251.53147: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4979070> <<< 11762 1726853251.53190: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11762 1726853251.53193: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11762 1726853251.53222: stdout chunk (state=3): >>>import '_sre' # <<< 11762 1726853251.53224: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11762 1726853251.53259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11762 1726853251.53314: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11762 1726853251.53333: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49d37d0> <<< 11762 1726853251.53355: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49d23f0> <<< 11762 1726853251.53373: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4992150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49d0bc0> <<< 11762 1726853251.53435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 11762 1726853251.53439: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a08890> <<< 11762 1726853251.53455: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49782f0> # 
/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 11762 1726853251.53466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11762 1726853251.53528: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a08d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a08bf0> <<< 11762 1726853251.53543: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a08fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4976e10> <<< 11762 1726853251.53575: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 11762 1726853251.53597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.53621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11762 1726853251.53647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a09670> <<< 11762 1726853251.53658: stdout chunk (state=3): >>>import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a09370> import 'importlib.machinery' # <<< 11762 1726853251.53718: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a0a540> <<< 11762 1726853251.53753: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11762 1726853251.53756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11762 1726853251.53792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11762 1726853251.53823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a20740> <<< 11762 1726853251.53848: stdout chunk (state=3): >>>import 'errno' # <<< 11762 1726853251.53865: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.53885: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a21e20> <<< 11762 1726853251.53899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11762 1726853251.53936: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11762 1726853251.53940: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a22cc0> <<< 11762 1726853251.53988: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a232f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a22210> <<< 11762 1726853251.54019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11762 1726853251.54068: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a23d70> <<< 11762 1726853251.54096: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a234a0> <<< 11762 1726853251.54135: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a0a4b0> <<< 11762 1726853251.54167: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11762 1726853251.54214: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11762 1726853251.54217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11762 1726853251.54253: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4717c50> <<< 11762 1726853251.54273: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 11762 1726853251.54312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11762 1726853251.54316: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd47407a0> <<< 11762 1726853251.54358: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4740500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd47407d0> <<< 11762 1726853251.54367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 11762 1726853251.54383: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11762 1726853251.54442: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.54616: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4741100> <<< 11762 1726853251.54745: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4741af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47409b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4715df0> <<< 11762 1726853251.54779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11762 1726853251.54840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4742f00> <<< 11762 1726853251.54893: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4741c40> <<< 11762 1726853251.54897: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a0ac60> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11762 1726853251.54966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.54983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11762 1726853251.55009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11762 1726853251.55040: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd476b230> <<< 11762 1726853251.55098: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11762 1726853251.55111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.55128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11762 1726853251.55150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11762 1726853251.55189: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd478f5f0> <<< 11762 1726853251.55217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11762 1726853251.55256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11762 1726853251.55312: stdout chunk (state=3): >>>import 'ntpath' # <<< 11762 1726853251.55334: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/__init__.py <<< 11762 1726853251.55345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47f0380> <<< 11762 1726853251.55351: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11762 1726853251.55388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11762 1726853251.55410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11762 1726853251.55454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11762 1726853251.55544: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47f2ae0> <<< 11762 1726853251.55623: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47f04a0> <<< 11762 1726853251.55655: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47b1370> <<< 11762 1726853251.55688: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 11762 1726853251.55706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4125430> <<< 11762 1726853251.55709: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd478e3f0> <<< 11762 1726853251.55719: stdout chunk (state=3): >>>import 'zipfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4743e00> <<< 11762 1726853251.55916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11762 1726853251.55926: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fbfd478e750> <<< 11762 1726853251.56413: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_h55yr0yd/ansible_setup_payload.zip' # zipimport: zlib available <<< 11762 1726853251.56492: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.56532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11762 1726853251.56536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11762 1726853251.56582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11762 1726853251.56657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11762 1726853251.56770: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd418f170> <<< 11762 1726853251.56789: stdout chunk (state=3): >>>import '_typing' # <<< 11762 1726853251.56997: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd416e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd416d1c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 11762 1726853251.57005: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853251.57008: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 11762 1726853251.58420: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.59681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd418d040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11762 1726853251.59737: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11762 1726853251.59763: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd41beab0> <<< 11762 1726853251.59865: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41be840> <<< 11762 1726853251.59887: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41be150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches 
/usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41be5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd418fe00> import 'atexit' # <<< 11762 1726853251.59933: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd41bf860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd41bfaa0> <<< 11762 1726853251.60091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11762 1726853251.60110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41bffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11762 1726853251.60127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11762 1726853251.60158: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4029dc0> <<< 11762 1726853251.60204: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' 
executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd402b9e0> <<< 11762 1726853251.60229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11762 1726853251.60318: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402c3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11762 1726853251.60329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11762 1726853251.60356: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402d550> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11762 1726853251.60530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402ffe0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4034320> <<< 11762 1726853251.60555: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402e2d0> <<< 11762 
1726853251.60559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11762 1726853251.60604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11762 1726853251.60630: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11762 1726853251.60648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11762 1726853251.60799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11762 1726853251.60813: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4037ec0> import '_tokenize' # <<< 11762 1726853251.60889: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4036990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd40366f0> <<< 11762 1726853251.60921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11762 1726853251.61022: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4036c60> <<< 11762 1726853251.61025: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402e7b0> <<< 11762 1726853251.61050: stdout chunk (state=3): >>># extension 
module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd407bf20> <<< 11762 1726853251.61083: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407c200> <<< 11762 1726853251.61121: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11762 1726853251.61124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11762 1726853251.61144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11762 1726853251.61202: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.61217: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd407dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11762 1726853251.61245: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11762 1726853251.61295: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.61314: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4080230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407e390> <<< 11762 1726853251.61323: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11762 1726853251.61378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.61390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 11762 1726853251.61404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11762 1726853251.61412: stdout chunk (state=3): >>>import '_string' # <<< 11762 1726853251.61460: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4083980> <<< 11762 1726853251.61585: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4080380> <<< 11762 1726853251.61649: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4084800> <<< 11762 
1726853251.61678: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.61687: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd40849e0> <<< 11762 1726853251.61722: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.61726: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4084350> <<< 11762 1726853251.61735: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407c320> <<< 11762 1726853251.61758: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 11762 1726853251.61764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 11762 1726853251.61787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11762 1726853251.61807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11762 1726853251.61836: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.61859: stdout chunk (state=3): >>># extension module '_socket' 
executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f10500> <<< 11762 1726853251.62015: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f118e0> <<< 11762 1726853251.62031: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4086c90> <<< 11762 1726853251.62059: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.62081: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4080590> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd40868a0> <<< 11762 1726853251.62098: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.62107: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11762 1726853251.62121: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.62215: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.62303: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.62322: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 11762 1726853251.62349: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11762 1726853251.62358: 
stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 11762 1726853251.62370: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.62492: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.62612: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.63186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.63756: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11762 1726853251.63791: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11762 1726853251.63811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.63862: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f15a60> <<< 11762 1726853251.63957: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11762 1726853251.63983: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f16750> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f11b20> <<< 11762 1726853251.64035: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11762 1726853251.64073: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 11762 1726853251.64088: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11762 1726853251.64297: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.64414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f16450> <<< 11762 1726853251.64422: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.64873: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.65312: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.65397: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.65475: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11762 1726853251.65479: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.65514: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.65549: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11762 1726853251.65628: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.65778: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11762 1726853251.65796: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.65836: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11762 1726853251.65839: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66068: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66300: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11762 1726853251.66369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11762 1726853251.66374: stdout chunk (state=3): >>>import '_ast' # <<< 11762 1726853251.66453: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f17950> <<< 11762 1726853251.66456: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66540: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66613: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11762 1726853251.66617: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 11762 1726853251.66650: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 11762 1726853251.66696: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66727: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11762 1726853251.66752: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66785: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66824: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66893: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.66954: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11762 1726853251.66993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.67105: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f22180> <<< 11762 1726853251.67124: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f1fe30> <<< 11762 1726853251.67162: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11762 1726853251.67473: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11762 1726853251.67506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11762 1726853251.67518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11762 1726853251.67582: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd400aa80> <<< 11762 1726853251.67616: stdout chunk (state=3): >>>import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41ea750> <<< 11762 1726853251.67708: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f22270> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f14d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11762 1726853251.67734: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.67755: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.67785: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11762 1726853251.67851: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11762 1726853251.67869: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11762 1726853251.67892: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.67942: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68029: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68042: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68115: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68133: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68166: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68199: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11762 1726853251.68213: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68291: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68420: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853251.68425: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11762 1726853251.68427: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68607: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68783: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68820: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.68880: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 11762 1726853251.68885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853251.68916: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11762 1726853251.68930: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11762 1726853251.68961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11762 1726853251.68984: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb2660> <<< 11762 1726853251.69020: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 11762 1726853251.69024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11762 1726853251.69050: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11762 1726853251.69082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11762 1726853251.69113: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11762 1726853251.69116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 11762 1726853251.69146: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b3ffb0> <<< 11762 1726853251.69177: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.69181: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3b443e0> <<< 11762 1726853251.69252: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f9f2c0> <<< 11762 1726853251.69255: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb31a0> <<< 11762 1726853251.69310: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb0d70> <<< 11762 1726853251.69318: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb08c0> <<< 11762 1726853251.69320: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11762 1726853251.69366: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11762 1726853251.69411: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11762 1726853251.69425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11762 1726853251.69459: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3b47290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b46b70> <<< 11762 1726853251.69509: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3b46d20> <<< 11762 1726853251.69512: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b45fd0> <<< 11762 1726853251.69533: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11762 1726853251.69682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11762 1726853251.69685: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fbfd3b47410> <<< 11762 1726853251.69719: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11762 1726853251.69751: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3ba1eb0> <<< 11762 1726853251.69805: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b47ec0> <<< 11762 1726853251.69847: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb0b60> import 'ansible.module_utils.facts.timeout' # <<< 11762 1726853251.69875: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 11762 1726853251.69948: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11762 1726853251.70023: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70059: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70143: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 11762 1726853251.70163: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 11762 
1726853251.70188: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 11762 1726853251.70247: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70280: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11762 1726853251.70386: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11762 1726853251.70434: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70493: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70541: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70606: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.70663: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 11762 1726853251.70686: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.71165: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.71610: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 11762 1726853251.71662: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.71716: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.71747: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.71818: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 11762 1726853251.71869: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.system.env' # <<< 11762 1726853251.71879: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.71936: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.71999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 11762 1726853251.72025: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72069: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 11762 1726853251.72093: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11762 1726853251.72129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11762 1726853251.72147: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72229: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 11762 1726853251.72331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3ba2150> <<< 11762 1726853251.72362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11762 1726853251.72390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11762 1726853251.72513: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3ba2d20> import 'ansible.module_utils.facts.system.local' # <<< 11762 1726853251.72537: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72593: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72679: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 11762 1726853251.72747: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11762 1726853251.72851: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72909: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.72986: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11762 1726853251.72993: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.73033: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.73087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11762 1726853251.73135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11762 1726853251.73204: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.73272: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3be2330> <<< 11762 1726853251.73465: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3bd2180> import 'ansible.module_utils.facts.system.python' # <<< 11762 1726853251.73478: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.73534: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.73595: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11762 1726853251.73603: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 
1726853251.73691: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.73769: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.73886: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74026: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 11762 1726853251.74043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 11762 1726853251.74081: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11762 1726853251.74129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74172: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11762 1726853251.74235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11762 1726853251.74251: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.74276: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853251.74289: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3bf60c0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3bf5ca0> import 'ansible.module_utils.facts.system.user' # <<< 11762 1726853251.74298: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74318: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 11762 
1726853251.74341: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74376: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74416: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 11762 1726853251.74423: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74583: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74729: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11762 1726853251.74741: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74839: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74940: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.74986: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75023: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 11762 1726853251.75037: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 11762 1726853251.75042: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75058: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75085: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75227: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 11762 1726853251.75514: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75632: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11762 1726853251.75657: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75681: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.75709: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853251.76270: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.76781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 11762 1726853251.76791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 11762 1726853251.76799: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.76900: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77011: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11762 1726853251.77017: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77112: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11762 1726853251.77219: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77376: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77530: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11762 1726853251.77551: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77554: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 11762 1726853251.77573: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77619: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 11762 1726853251.77674: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77762: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.77861: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78072: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11762 1726853251.78280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 11762 1726853251.78283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 11762 1726853251.78296: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78324: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78372: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11762 1726853251.78376: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78399: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78418: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11762 1726853251.78438: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78503: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11762 1726853251.78582: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78606: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 11762 1726853251.78641: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78700: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11762 1726853251.78761: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78825: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.78884: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11762 1726853251.78891: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 
1726853251.79163: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11762 1726853251.79497: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79569: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11762 1726853251.79580: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79600: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 11762 1726853251.79674: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 11762 1726853251.79750: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 11762 1726853251.79801: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79879: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.79985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11762 1726853251.79989: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80002: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 11762 1726853251.80041: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 11762 1726853251.80127: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80154: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11762 1726853251.80187: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80231: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80301: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 11762 1726853251.80412: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 11762 1726853251.80440: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11762 1726853251.80715: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11762 1726853251.80918: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.80957: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.81006: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11762 1726853251.81016: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.81065: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.81117: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11762 1726853251.81121: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.81192: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.81298: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 11762 1726853251.81309: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.81379: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 11762 1726853251.81473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11762 1726853251.81557: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853251.82555: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11762 1726853251.82576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11762 1726853251.82641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11762 1726853251.82647: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd39f3d70> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd39f03e0> <<< 11762 1726853251.82711: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd39f1a90> <<< 11762 1726853251.83089: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "31", "epoch": "1726853251", "epoch_int": "1726853251", "date": "2024-09-20", "time": "13:27:31", "iso8601_micro": "2024-09-20T17:27:31.820395Z", "iso8601": "2024-09-20T17:27:31Z", "iso8601_basic": "20240920T132731820395", "iso8601_basic_short": "20240920T132731", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_cmdline<<< 11762 1726853251.83099: stdout chunk (state=3): >>>": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11762 1726853251.83681: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 11762 1726853251.83761: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing 
_stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types <<< 11762 1726853251.83768: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 <<< 11762 1726853251.83858: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing 
hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 11762 1726853251.83895: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # 
cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors <<< 11762 1726853251.83988: stdout chunk (state=3): >>># destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] 
removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other <<< 11762 1726853251.84062: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos<<< 11762 1726853251.84137: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps<<< 11762 1726853251.84140: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy 
ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme<<< 11762 1726853251.84170: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11762 1726853251.84568: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11762 1726853251.84616: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 11762 1726853251.84630: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 11762 1726853251.84711: stdout chunk (state=3): >>># destroy ipaddress # destroy ntpath <<< 11762 1726853251.84762: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 11762 1726853251.84765: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11762 1726853251.84818: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11762 1726853251.84882: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy 
multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 11762 1726853251.84935: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 11762 1726853251.84999: stdout chunk (state=3): >>># destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 11762 1726853251.85041: stdout chunk (state=3): >>># destroy _ssl <<< 11762 1726853251.85068: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 11762 1726853251.85089: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 11762 1726853251.85185: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 11762 1726853251.85251: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] 
wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 11762 1726853251.85332: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 11762 1726853251.85360: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11762 1726853251.85542: stdout chunk (state=3): >>># destroy 
sys.monitoring # destroy _socket <<< 11762 1726853251.85623: stdout chunk (state=3): >>># destroy _collections <<< 11762 1726853251.85634: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11762 1726853251.85670: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11762 1726853251.85711: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11762 1726853251.85729: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11762 1726853251.85822: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 11762 1726853251.85883: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 11762 1726853251.85935: stdout chunk (state=3): >>># destroy _hashlib <<< 11762 1726853251.85942: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 11762 1726853251.85956: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11762 
1726853251.86386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853251.86505: stderr chunk (state=3): >>><<< 11762 1726853251.86508: stdout chunk (state=3): >>><<< 11762 1726853251.86594: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4b684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4b37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4b6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd493d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd493dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd497bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fbfd497bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49b3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49b3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4993b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49912b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4979070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fbfd49d37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49d23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4992150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49d0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a08890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd49782f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a08d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a08bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a08fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4976e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object 
from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a09670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a09370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a0a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a20740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a21e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a22cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a232f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a22210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4a23d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a234a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a0a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4717c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd47407a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4740500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd47407d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4741100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4741af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47409b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4715df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4742f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4741c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4a0ac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd476b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd478f5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47f0380> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47f2ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47f04a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd47b1370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4125430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd478e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4743e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fbfd478e750> # zipimport: found 103 names in '/tmp/ansible_setup_payload_h55yr0yd/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd418f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd416e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd416d1c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd418d040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd41beab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41be840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41be150> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41be5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd418fe00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd41bf860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd41bfaa0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41bffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4029dc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd402b9e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402c3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402d550> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402ffe0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4034320> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402e2d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4037ec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4036990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd40366f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4036c60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd402e7b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd407bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407c200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd407dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4080230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407e390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4083980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4080380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4084800> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd40849e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4084350> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd407c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f10500> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f118e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd4086c90> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd4080590> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd40868a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f15a60> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f16750> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f11b20> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f16450> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f17950> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3f22180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f1fe30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd400aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd41ea750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f22270> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f14d70> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb2660> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b3ffb0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3b443e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3f9f2c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb31a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb0d70> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb08c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3b47290> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b46b70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3b46d20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b45fd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b47410> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3ba1eb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3b47ec0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3fb0b60> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3ba2150> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3ba2d20> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3be2330> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3bd2180> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd3bf60c0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd3bf5ca0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfd39f3d70> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd39f03e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfd39f1a90> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "31", "epoch": "1726853251", "epoch_int": "1726853251", "date": "2024-09-20", "time": "13:27:31", "iso8601_micro": "2024-09-20T17:27:31.820395Z", "iso8601": "2024-09-20T17:27:31Z", "iso8601_basic": "20240920T132731820395", "iso8601_basic_short": "20240920T132731", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 
21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing 
systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear 
sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11762 1726853251.87786: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853251.87789: _low_level_execute_command(): starting 11762 1726853251.87791: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853251.3632505-11855-163286145953637/ > /dev/null 2>&1 && sleep 0' 11762 1726853251.87991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853251.88006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853251.88021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853251.88039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853251.88136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.88162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.88265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.90217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.90238: stderr chunk (state=3): >>><<< 11762 1726853251.90252: stdout chunk (state=3): >>><<< 11762 1726853251.90476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853251.90480: handler run complete 11762 1726853251.90482: variable 'ansible_facts' from source: unknown 11762 1726853251.90484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.90498: variable 'ansible_facts' from source: unknown 11762 1726853251.90545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.90614: attempt loop complete, returning result 11762 1726853251.90622: _execute() done 11762 1726853251.90628: dumping result to json 11762 1726853251.90644: done dumping result, returning 11762 1726853251.90656: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-d845-03d0-000000000026] 11762 1726853251.90666: sending task result 
for task 02083763-bbaf-d845-03d0-000000000026 ok: [managed_node2] 11762 1726853251.91068: no more pending results, returning what we have 11762 1726853251.91073: results queue empty 11762 1726853251.91074: checking for any_errors_fatal 11762 1726853251.91075: done checking for any_errors_fatal 11762 1726853251.91075: checking for max_fail_percentage 11762 1726853251.91077: done checking for max_fail_percentage 11762 1726853251.91078: checking to see if all hosts have failed and the running result is not ok 11762 1726853251.91078: done checking to see if all hosts have failed 11762 1726853251.91079: getting the remaining hosts for this loop 11762 1726853251.91081: done getting the remaining hosts for this loop 11762 1726853251.91084: getting the next task for host managed_node2 11762 1726853251.91092: done getting next task for host managed_node2 11762 1726853251.91094: ^ task is: TASK: Check if system is ostree 11762 1726853251.91097: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853251.91101: getting variables 11762 1726853251.91102: in VariableManager get_vars() 11762 1726853251.91130: Calling all_inventory to load vars for managed_node2 11762 1726853251.91133: Calling groups_inventory to load vars for managed_node2 11762 1726853251.91252: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853251.91264: Calling all_plugins_play to load vars for managed_node2 11762 1726853251.91267: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853251.91270: Calling groups_plugins_play to load vars for managed_node2 11762 1726853251.91567: done sending task result for task 02083763-bbaf-d845-03d0-000000000026 11762 1726853251.91574: WORKER PROCESS EXITING 11762 1726853251.91598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853251.91821: done with get_vars() 11762 1726853251.91832: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:27:31 -0400 (0:00:00.622) 0:00:02.349 ****** 11762 1726853251.91928: entering _queue_task() for managed_node2/stat 11762 1726853251.92183: worker is 1 (out of 1 available) 11762 1726853251.92196: exiting _queue_task() for managed_node2/stat 11762 1726853251.92208: done queuing things up, now waiting for results queue to drain 11762 1726853251.92210: waiting for pending results... 
11762 1726853251.92495: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 11762 1726853251.92552: in run() - task 02083763-bbaf-d845-03d0-000000000028 11762 1726853251.92564: variable 'ansible_search_path' from source: unknown 11762 1726853251.92568: variable 'ansible_search_path' from source: unknown 11762 1726853251.92601: calling self._execute() 11762 1726853251.92653: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853251.92658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853251.92670: variable 'omit' from source: magic vars 11762 1726853251.93013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853251.93182: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853251.93213: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853251.93241: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853251.93281: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853251.93346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853251.93363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853251.93385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853251.93402: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853251.93490: Evaluated conditional (not __network_is_ostree is defined): True 11762 1726853251.93494: variable 'omit' from source: magic vars 11762 1726853251.93516: variable 'omit' from source: magic vars 11762 1726853251.93541: variable 'omit' from source: magic vars 11762 1726853251.93561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853251.93586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853251.93600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853251.93614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853251.93622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853251.93648: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853251.93651: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853251.93653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853251.93717: Set connection var ansible_timeout to 10 11762 1726853251.93720: Set connection var ansible_shell_type to sh 11762 1726853251.93725: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853251.93730: Set connection var ansible_shell_executable to /bin/sh 11762 1726853251.93737: Set connection var ansible_pipelining to False 11762 1726853251.93742: Set connection var ansible_connection to ssh 11762 1726853251.93760: variable 'ansible_shell_executable' from source: unknown 11762 1726853251.93764: variable 'ansible_connection' from 
source: unknown 11762 1726853251.93766: variable 'ansible_module_compression' from source: unknown 11762 1726853251.93769: variable 'ansible_shell_type' from source: unknown 11762 1726853251.93773: variable 'ansible_shell_executable' from source: unknown 11762 1726853251.93775: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853251.93777: variable 'ansible_pipelining' from source: unknown 11762 1726853251.93781: variable 'ansible_timeout' from source: unknown 11762 1726853251.93783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853251.93882: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853251.93890: variable 'omit' from source: magic vars 11762 1726853251.93894: starting attempt loop 11762 1726853251.93898: running the handler 11762 1726853251.93910: _low_level_execute_command(): starting 11762 1726853251.93916: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853251.94407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853251.94410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853251.94413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853251.94415: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853251.94469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853251.94477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853251.94549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.96287: stdout chunk (state=3): >>>/root <<< 11762 1726853251.96379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.96387: stdout chunk (state=3): >>><<< 11762 1726853251.96394: stderr chunk (state=3): >>><<< 11762 1726853251.96434: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853251.96443: _low_level_execute_command(): starting 11762 1726853251.96449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533 `" && echo ansible-tmp-1726853251.9642293-11882-93250814555533="` echo /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533 `" ) && sleep 0' 11762 1726853251.96858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853251.96862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853251.96878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853251.96923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853251.96927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 11762 1726853251.97001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853251.99009: stdout chunk (state=3): >>>ansible-tmp-1726853251.9642293-11882-93250814555533=/root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533 <<< 11762 1726853251.99281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853251.99284: stdout chunk (state=3): >>><<< 11762 1726853251.99286: stderr chunk (state=3): >>><<< 11762 1726853251.99289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853251.9642293-11882-93250814555533=/root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853251.99291: variable 'ansible_module_compression' from source: unknown 11762 1726853251.99293: ANSIBALLZ: 
Using lock for stat 11762 1726853251.99295: ANSIBALLZ: Acquiring lock 11762 1726853251.99297: ANSIBALLZ: Lock acquired: 139956166285776 11762 1726853251.99299: ANSIBALLZ: Creating module 11762 1726853252.08453: ANSIBALLZ: Writing module into payload 11762 1726853252.08551: ANSIBALLZ: Writing module 11762 1726853252.08579: ANSIBALLZ: Renaming module 11762 1726853252.08589: ANSIBALLZ: Done creating module 11762 1726853252.08610: variable 'ansible_facts' from source: unknown 11762 1726853252.08691: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/AnsiballZ_stat.py 11762 1726853252.08912: Sending initial data 11762 1726853252.08921: Sent initial data (152 bytes) 11762 1726853252.09498: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.09521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.09624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 11762 1726853252.11296: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853252.11320: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853252.11415: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853252.11484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp9djf5lbo /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/AnsiballZ_stat.py <<< 11762 1726853252.11504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/AnsiballZ_stat.py" <<< 11762 1726853252.11575: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp9djf5lbo" to remote "/root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/AnsiballZ_stat.py" <<< 11762 1726853252.12435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.12475: stderr chunk (state=3): >>><<< 11762 1726853252.12485: stdout chunk (state=3): >>><<< 11762 
1726853252.12534: done transferring module to remote 11762 1726853252.12548: _low_level_execute_command(): starting 11762 1726853252.12628: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/ /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/AnsiballZ_stat.py && sleep 0' 11762 1726853252.13207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853252.13285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853252.13323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853252.13337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.13355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.13506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853252.15374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.15385: stdout chunk (state=3): >>><<< 11762 
1726853252.15395: stderr chunk (state=3): >>><<< 11762 1726853252.15422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853252.15511: _low_level_execute_command(): starting 11762 1726853252.15517: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/AnsiballZ_stat.py && sleep 0' 11762 1726853252.16108: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853252.16122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853252.16137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853252.16153: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853252.16189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853252.16287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.16311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.16422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853252.18727: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11762 1726853252.18767: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 11762 1726853252.18838: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11762 1726853252.18863: stdout chunk (state=3): >>>import 'posix' # <<< 11762 1726853252.18919: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11762 1726853252.18949: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11762 1726853252.19007: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 11762 1726853252.19011: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 11762 1726853252.19034: stdout chunk (state=3): >>>import 'codecs' # <<< 11762 1726853252.19083: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11762 1726853252.19169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fbc4d0> <<< 11762 1726853252.19183: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88f8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fbea50> import '_signal' # <<< 11762 1726853252.19313: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 11762 1726853252.19318: stdout chunk (state=3): >>>import '_stat' # <<< 11762 1726853252.19481: stdout chunk (state=3): >>>import 'stat' # <<< 11762 1726853252.19485: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fcd130> <<< 11762 1726853252.19556: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.19579: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fcdfa0> <<< 11762 1726853252.19619: stdout chunk (state=3): >>>import 'site' # <<< 11762 1726853252.19630: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11762 1726853252.19855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11762 1726853252.19884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11762 1726853252.19901: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11762 1726853252.19958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11762 1726853252.19975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11762 1726853252.20005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dcbe90> <<< 11762 1726853252.20038: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11762 1726853252.20072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11762 1726853252.20085: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dcbf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11762 1726853252.20115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11762 1726853252.20137: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11762 1726853252.20204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.20211: stdout chunk (state=3): >>>import 'itertools' # <<< 11762 1726853252.20269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e03890> <<< 11762 1726853252.20274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 11762 1726853252.20301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e03f20> <<< 11762 1726853252.20304: stdout chunk (state=3): >>>import '_collections' # <<< 11762 1726853252.20336: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdb88de3b60> import '_functools' # <<< 11762 1726853252.20372: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88de1280> <<< 11762 1726853252.20461: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dc9040> <<< 11762 1726853252.20508: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11762 1726853252.20533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 11762 1726853252.20558: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11762 1726853252.20586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11762 1726853252.20597: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11762 1726853252.20634: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e23800> <<< 11762 1726853252.20668: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e22420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88de2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e20b60> <<< 11762 1726853252.20739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc 
matches /usr/lib64/python3.12/copyreg.py <<< 11762 1726853252.20775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e58860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dc82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11762 1726853252.20800: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e58d10> <<< 11762 1726853252.20847: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e58bc0> <<< 11762 1726853252.20850: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.20890: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e58f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dc6de0> <<< 11762 1726853252.20904: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11762 1726853252.20949: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11762 1726853252.20962: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e59610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e592e0> import 'importlib.machinery' # <<< 11762 1726853252.21018: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11762 1726853252.21021: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e5a510> <<< 11762 1726853252.21056: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11762 1726853252.21059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11762 1726853252.21125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 11762 1726853252.21128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e70710> <<< 11762 1726853252.21182: stdout chunk (state=3): >>>import 'errno' # <<< 11762 1726853252.21186: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.21225: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e71df0> # 
/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11762 1726853252.21237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11762 1726853252.21260: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e72c90> <<< 11762 1726853252.21312: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e732f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e721e0> <<< 11762 1726853252.21316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11762 1726853252.21368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11762 1726853252.21374: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.21383: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e73d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e734a0> <<< 11762 1726853252.21417: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e5a540> <<< 11762 1726853252.21458: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11762 1726853252.21475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11762 1726853252.21501: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11762 1726853252.21555: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.21580: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88befc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11762 1726853252.21606: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c187a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c18500> <<< 11762 1726853252.21655: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c18710> <<< 11762 1726853252.21658: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11762 1726853252.21726: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.21859: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c190d0> <<< 11762 1726853252.22003: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c19ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c18980> <<< 11762 1726853252.22031: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88beddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11762 1726853252.22055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11762 1726853252.22108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11762 1726853252.22111: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c1aea0> <<< 11762 1726853252.22143: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fdb88c19be0> <<< 11762 1726853252.22173: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e5ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11762 1726853252.22279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.22282: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11762 1726853252.22316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11762 1726853252.22319: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c431d0> <<< 11762 1726853252.22393: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11762 1726853252.22396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.22419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11762 1726853252.22481: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c67590> <<< 11762 1726853252.22492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11762 1726853252.22534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11762 1726853252.22689: stdout chunk (state=3): >>>import 'ntpath' # # 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88cc82f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11762 1726853252.22692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11762 1726853252.22728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11762 1726853252.22812: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88ccaa20> <<< 11762 1726853252.22898: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88cc83e0> <<< 11762 1726853252.22920: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c8d310> <<< 11762 1726853252.22949: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885253a0> <<< 11762 1726853252.22973: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c66390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c1bda0> <<< 11762 1726853252.23118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11762 1726853252.23121: stdout chunk 
(state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdb88c66990> <<< 11762 1726853252.23273: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_vxui_iqv/ansible_stat_payload.zip' <<< 11762 1726853252.23276: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.23397: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.23423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11762 1726853252.23448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11762 1726853252.23483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11762 1726853252.23556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11762 1726853252.23592: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8857b080> import '_typing' # <<< 11762 1726853252.24008: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88559f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88559100> # zipimport: zlib available <<< 11762 1726853252.24011: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11762 1726853252.25293: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.26529: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88578f20> <<< 11762 1726853252.26542: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.26546: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11762 1726853252.26549: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11762 1726853252.26574: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb885a29c0> <<< 11762 1726853252.26600: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a2780> <<< 11762 1726853252.26653: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a20f0> <<< 11762 1726853252.26705: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11762 1726853252.26757: stdout 
chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a2ae0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8857bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb885a3710> <<< 11762 1726853252.26785: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb885a3950> <<< 11762 1726853252.26833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11762 1726853252.26870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11762 1726853252.26890: stdout chunk (state=3): >>>import '_locale' # <<< 11762 1726853252.26929: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11762 1726853252.26969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11762 1726853252.27043: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8840dbe0> <<< 11762 1726853252.27052: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 
'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8840f7d0> <<< 11762 1726853252.27079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11762 1726853252.27111: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884141d0> <<< 11762 1726853252.27146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11762 1726853252.27184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11762 1726853252.27198: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88415340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11762 1726853252.27250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11762 1726853252.27269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11762 1726853252.27320: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88417e00> <<< 11762 1726853252.27381: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88559070> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884160c0> <<< 11762 1726853252.27417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11762 1726853252.27476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11762 1726853252.27500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11762 1726853252.27567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841bce0> import '_tokenize' # <<< 11762 1726853252.27654: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841a510> <<< 11762 1726853252.27665: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11762 1726853252.27766: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841aa80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884165d0> <<< 11762 1726853252.27850: stdout chunk (state=3): >>># 
extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88463fb0> <<< 11762 1726853252.27881: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88464110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11762 1726853252.27907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11762 1726853252.27948: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88465bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88465970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11762 1726853252.28077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11762 1726853252.28136: stdout chunk (state=3): >>># extension module '_uuid' loaded 
from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.28168: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884680e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88466270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11762 1726853252.28291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 11762 1726853252.28400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846b830> <<< 11762 1726853252.28438: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88468200> <<< 11762 1726853252.28916: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846c5f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846c860> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846cbf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884642f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884f82f0> <<< 11762 1726853252.28920: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.28922: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884f9910> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846ea80> <<< 11762 1726853252.28924: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 11762 1726853252.28926: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846fe30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846e6c0> # zipimport: zlib available <<< 11762 1726853252.28952: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11762 1726853252.29031: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.29129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.29161: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 11762 1726853252.29227: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11762 1726853252.29295: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.29415: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.29976: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.30512: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11762 1726853252.30545: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11762 1726853252.30559: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11762 1726853252.30582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.30624: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 11762 1726853252.30639: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884fdbb0> <<< 11762 1726853252.30707: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11762 1726853252.30735: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884fe840> <<< 11762 1726853252.30738: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884f9a60> <<< 11762 1726853252.30808: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11762 1726853252.30812: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.30841: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 11762 1726853252.30844: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.31049: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.31228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884fe540> # zipimport: zlib available <<< 11762 1726853252.31635: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32070: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32142: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32218: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11762 1726853252.32222: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32264: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32298: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11762 1726853252.32312: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32379: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32458: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11762 1726853252.32468: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32490: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11762 1726853252.32505: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32541: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.32588: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11762 1726853252.32825: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11762 1726853252.33131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11762 1726853252.33136: stdout chunk (state=3): >>>import '_ast' # <<< 11762 1726853252.33217: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884ffa70> <<< 11762 1726853252.33223: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33297: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33372: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11762 1726853252.33386: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 11762 1726853252.33394: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 11762 1726853252.33407: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33454: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33487: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11762 1726853252.33505: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33541: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33590: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33644: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.33718: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11762 1726853252.33765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.33960: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8830a450> <<< 11762 1726853252.34150: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88305340> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11762 1726853252.34155: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: 
zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11762 1726853252.34203: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11762 1726853252.34206: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11762 1726853252.34407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885dec60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885ee930> <<< 11762 1726853252.34513: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8830a1b0> <<< 11762 1726853252.34549: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846d970> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 11762 1726853252.34576: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11762 1726853252.34732: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11762 
1726853252.34831: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.35019: stdout chunk (state=3): >>># zipimport: zlib available <<< 11762 1726853252.35167: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 11762 1726853252.35527: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr<<< 11762 1726853252.35557: stdout chunk (state=3): >>> # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] 
removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib <<< 11762 1726853252.35638: stdout chunk (state=3): >>># destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib <<< 11762 1726853252.35646: stdout chunk (state=3): >>># cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing 
ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 11762 1726853252.35720: stdout chunk (state=3): >>># cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] 
removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing 
ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules <<< 11762 1726853252.35746: stdout chunk (state=3): >>># destroy ansible.modules <<< 11762 1726853252.35983: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11762 1726853252.36056: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11762 1726853252.36063: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 11762 1726853252.36134: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 11762 1726853252.36189: stdout 
chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11762 1726853252.36218: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 11762 1726853252.36233: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 11762 1726853252.36341: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 11762 1726853252.36349: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 11762 1726853252.36458: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping 
_sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 11762 1726853252.36461: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 11762 1726853252.36473: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11762 1726853252.36608: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11762 1726853252.36669: stdout chunk (state=3): >>># destroy _collections <<< 11762 1726853252.36673: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11762 1726853252.36757: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 11762 1726853252.36761: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11762 1726853252.36778: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11762 1726853252.36865: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11762 1726853252.36940: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 11762 1726853252.36948: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 11762 1726853252.36975: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11762 1726853252.37376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.37427: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853252.37439: stderr chunk (state=3): >>><<< 11762 1726853252.37452: stdout chunk (state=3): >>><<< 11762 1726853252.37587: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88f8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88fcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dcbe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dcbf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e03890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e03f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88de3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88de1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dc9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e23800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e22420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88de2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e20b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e58860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dc82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e58d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e58bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e58f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88dc6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e59610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e592e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e5a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e70710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e71df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e72c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e732f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e721e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88e73d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e734a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e5a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88befc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c187a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c18500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c18710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c190d0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88c19ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c18980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88beddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c1aea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c19be0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88e5ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c431d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c67590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88cc82f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88ccaa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88cc83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c8d310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885253a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c66390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88c1bda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdb88c66990> # zipimport: found 30 names in '/tmp/ansible_stat_payload_vxui_iqv/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdb8857b080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88559f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88559100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88578f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb885a29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a2780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a20f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a2ae0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8857bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb885a3710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb885a3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885a3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8840dbe0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8840f7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdb884141d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88415340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88417e00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88559070> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884160c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841bce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841a510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8841aa80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884165d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88463fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88464110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb88465bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88465970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884680e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88466270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846b830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88468200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846c5f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846c860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846cbf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884642f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884f82f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884f9910> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846ea80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8846fe30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846e6c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb884fdbb0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884fe840> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884f9a60> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884fe540> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb884ffa70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdb8830a450> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb88305340> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885dec60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb885ee930> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8830a1b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdb8846d970> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. [WARNING]: Module invocation had junk after the JSON data: 11762 1726853252.38868: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853252.38873: _low_level_execute_command(): starting 11762 1726853252.38876: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853251.9642293-11882-93250814555533/ > /dev/null 2>&1 && sleep 0' 11762 1726853252.39291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853252.39430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853252.39450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.39476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.39607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853252.41543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.41580: stderr chunk (state=3): >>><<< 11762 1726853252.41591: stdout chunk (state=3): >>><<< 11762 1726853252.41614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11762 1726853252.41626: handler run complete 11762 1726853252.41651: attempt loop complete, returning result 11762 1726853252.41659: _execute() done 11762 1726853252.41665: dumping result to json 11762 1726853252.41675: done dumping result, returning 11762 1726853252.41686: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [02083763-bbaf-d845-03d0-000000000028] 11762 1726853252.41695: sending task result for task 02083763-bbaf-d845-03d0-000000000028 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11762 1726853252.41853: no more pending results, returning what we have 11762 1726853252.41856: results queue empty 11762 1726853252.41857: checking for any_errors_fatal 11762 1726853252.41863: done checking for any_errors_fatal 11762 1726853252.41863: checking for max_fail_percentage 11762 1726853252.41865: done checking for max_fail_percentage 11762 1726853252.41866: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.41866: done checking to see if all hosts have failed 11762 1726853252.41867: getting the remaining hosts for this loop 11762 1726853252.41869: done getting the remaining hosts for this loop 11762 1726853252.41874: getting the next task for host managed_node2 11762 1726853252.41881: done getting next task for host managed_node2 11762 1726853252.41883: ^ task is: TASK: Set flag to indicate system is ostree 11762 1726853252.41886: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853252.41889: getting variables 11762 1726853252.41891: in VariableManager get_vars() 11762 1726853252.41921: Calling all_inventory to load vars for managed_node2 11762 1726853252.41924: Calling groups_inventory to load vars for managed_node2 11762 1726853252.41928: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.41939: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.41942: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.41945: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.42401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.42667: done with get_vars() 11762 1726853252.42681: done getting variables 11762 1726853252.42716: done sending task result for task 02083763-bbaf-d845-03d0-000000000028 11762 1726853252.42720: WORKER PROCESS EXITING 11762 1726853252.42793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:27:32 -0400 (0:00:00.508) 0:00:02.858 ****** 11762 1726853252.42818: entering _queue_task() for managed_node2/set_fact 11762 1726853252.42820: Creating lock for set_fact 11762 1726853252.43088: worker is 1 (out of 1 available) 11762 1726853252.43102: exiting _queue_task() for managed_node2/set_fact 11762 1726853252.43113: done queuing things up, now waiting for results queue 
to drain 11762 1726853252.43115: waiting for pending results... 11762 1726853252.43358: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 11762 1726853252.43458: in run() - task 02083763-bbaf-d845-03d0-000000000029 11762 1726853252.43484: variable 'ansible_search_path' from source: unknown 11762 1726853252.43492: variable 'ansible_search_path' from source: unknown 11762 1726853252.43534: calling self._execute() 11762 1726853252.43614: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.43626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.43640: variable 'omit' from source: magic vars 11762 1726853252.44132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853252.44437: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853252.44491: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853252.44528: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853252.44574: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853252.44990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853252.44994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853252.44996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 
1726853252.44998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853252.45099: Evaluated conditional (not __network_is_ostree is defined): True 11762 1726853252.45229: variable 'omit' from source: magic vars 11762 1726853252.45281: variable 'omit' from source: magic vars 11762 1726853252.45584: variable '__ostree_booted_stat' from source: set_fact 11762 1726853252.45639: variable 'omit' from source: magic vars 11762 1726853252.45704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853252.45796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853252.45820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853252.45914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853252.45941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853252.46013: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853252.46069: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.46082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.46238: Set connection var ansible_timeout to 10 11762 1726853252.46249: Set connection var ansible_shell_type to sh 11762 1726853252.46262: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853252.46275: Set connection var ansible_shell_executable to /bin/sh 11762 1726853252.46289: Set connection var ansible_pipelining to False 11762 1726853252.46299: Set connection var ansible_connection to ssh 11762 1726853252.46330: 
variable 'ansible_shell_executable' from source: unknown 11762 1726853252.46338: variable 'ansible_connection' from source: unknown 11762 1726853252.46345: variable 'ansible_module_compression' from source: unknown 11762 1726853252.46351: variable 'ansible_shell_type' from source: unknown 11762 1726853252.46358: variable 'ansible_shell_executable' from source: unknown 11762 1726853252.46365: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.46388: variable 'ansible_pipelining' from source: unknown 11762 1726853252.46420: variable 'ansible_timeout' from source: unknown 11762 1726853252.46423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.46512: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853252.46530: variable 'omit' from source: magic vars 11762 1726853252.46539: starting attempt loop 11762 1726853252.46545: running the handler 11762 1726853252.46637: handler run complete 11762 1726853252.46640: attempt loop complete, returning result 11762 1726853252.46642: _execute() done 11762 1726853252.46644: dumping result to json 11762 1726853252.46646: done dumping result, returning 11762 1726853252.46648: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [02083763-bbaf-d845-03d0-000000000029] 11762 1726853252.46650: sending task result for task 02083763-bbaf-d845-03d0-000000000029 11762 1726853252.46714: done sending task result for task 02083763-bbaf-d845-03d0-000000000029 11762 1726853252.46716: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11762 1726853252.46795: no more pending results, returning what we have 
11762 1726853252.46798: results queue empty 11762 1726853252.46799: checking for any_errors_fatal 11762 1726853252.46804: done checking for any_errors_fatal 11762 1726853252.46805: checking for max_fail_percentage 11762 1726853252.46807: done checking for max_fail_percentage 11762 1726853252.46808: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.46809: done checking to see if all hosts have failed 11762 1726853252.46809: getting the remaining hosts for this loop 11762 1726853252.46811: done getting the remaining hosts for this loop 11762 1726853252.46815: getting the next task for host managed_node2 11762 1726853252.46823: done getting next task for host managed_node2 11762 1726853252.46825: ^ task is: TASK: Fix CentOS6 Base repo 11762 1726853252.46828: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.46832: getting variables 11762 1726853252.46833: in VariableManager get_vars() 11762 1726853252.46866: Calling all_inventory to load vars for managed_node2 11762 1726853252.46870: Calling groups_inventory to load vars for managed_node2 11762 1726853252.46893: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.46904: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.46908: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.46916: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.47253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.47579: done with get_vars() 11762 1726853252.47590: done getting variables 11762 1726853252.47705: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:27:32 -0400 (0:00:00.049) 0:00:02.907 ****** 11762 1726853252.47732: entering _queue_task() for managed_node2/copy 11762 1726853252.48065: worker is 1 (out of 1 available) 11762 1726853252.48080: exiting _queue_task() for managed_node2/copy 11762 1726853252.48096: done queuing things up, now waiting for results queue to drain 11762 1726853252.48098: waiting for pending results... 
11762 1726853252.48247: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 11762 1726853252.48298: in run() - task 02083763-bbaf-d845-03d0-00000000002b 11762 1726853252.48312: variable 'ansible_search_path' from source: unknown 11762 1726853252.48317: variable 'ansible_search_path' from source: unknown 11762 1726853252.48350: calling self._execute() 11762 1726853252.48401: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.48405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.48415: variable 'omit' from source: magic vars 11762 1726853252.48747: variable 'ansible_distribution' from source: facts 11762 1726853252.48760: Evaluated conditional (ansible_distribution == 'CentOS'): True 11762 1726853252.48840: variable 'ansible_distribution_major_version' from source: facts 11762 1726853252.48847: Evaluated conditional (ansible_distribution_major_version == '6'): False 11762 1726853252.48850: when evaluation is False, skipping this task 11762 1726853252.48853: _execute() done 11762 1726853252.48855: dumping result to json 11762 1726853252.48863: done dumping result, returning 11762 1726853252.48867: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [02083763-bbaf-d845-03d0-00000000002b] 11762 1726853252.48869: sending task result for task 02083763-bbaf-d845-03d0-00000000002b 11762 1726853252.48954: done sending task result for task 02083763-bbaf-d845-03d0-00000000002b 11762 1726853252.48957: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11762 1726853252.49036: no more pending results, returning what we have 11762 1726853252.49039: results queue empty 11762 1726853252.49039: checking for any_errors_fatal 11762 1726853252.49043: done checking for any_errors_fatal 11762 1726853252.49043: checking for 
max_fail_percentage 11762 1726853252.49047: done checking for max_fail_percentage 11762 1726853252.49047: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.49048: done checking to see if all hosts have failed 11762 1726853252.49049: getting the remaining hosts for this loop 11762 1726853252.49050: done getting the remaining hosts for this loop 11762 1726853252.49053: getting the next task for host managed_node2 11762 1726853252.49057: done getting next task for host managed_node2 11762 1726853252.49059: ^ task is: TASK: Include the task 'enable_epel.yml' 11762 1726853252.49062: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.49065: getting variables 11762 1726853252.49066: in VariableManager get_vars() 11762 1726853252.49092: Calling all_inventory to load vars for managed_node2 11762 1726853252.49094: Calling groups_inventory to load vars for managed_node2 11762 1726853252.49097: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.49105: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.49107: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.49116: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.49278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.49472: done with get_vars() 11762 1726853252.49482: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:27:32 -0400 (0:00:00.018) 0:00:02.926 ****** 11762 1726853252.49568: entering _queue_task() for managed_node2/include_tasks 11762 1726853252.49961: worker is 1 (out of 1 available) 11762 1726853252.50178: exiting _queue_task() for managed_node2/include_tasks 11762 1726853252.50189: done queuing things up, now waiting for results queue to drain 11762 1726853252.50191: waiting for pending results... 
11762 1726853252.50358: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 11762 1726853252.50481: in run() - task 02083763-bbaf-d845-03d0-00000000002c 11762 1726853252.50486: variable 'ansible_search_path' from source: unknown 11762 1726853252.50490: variable 'ansible_search_path' from source: unknown 11762 1726853252.50530: calling self._execute() 11762 1726853252.50609: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.50613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.50627: variable 'omit' from source: magic vars 11762 1726853252.51320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853252.53004: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853252.53057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853252.53084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853252.53123: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853252.53140: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853252.53201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853252.53220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853252.53240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853252.53275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853252.53283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853252.53365: variable '__network_is_ostree' from source: set_fact 11762 1726853252.53382: Evaluated conditional (not __network_is_ostree | d(false)): True 11762 1726853252.53387: _execute() done 11762 1726853252.53390: dumping result to json 11762 1726853252.53392: done dumping result, returning 11762 1726853252.53399: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-d845-03d0-00000000002c] 11762 1726853252.53403: sending task result for task 02083763-bbaf-d845-03d0-00000000002c 11762 1726853252.53640: done sending task result for task 02083763-bbaf-d845-03d0-00000000002c 11762 1726853252.53646: WORKER PROCESS EXITING 11762 1726853252.53688: no more pending results, returning what we have 11762 1726853252.53693: in VariableManager get_vars() 11762 1726853252.53722: Calling all_inventory to load vars for managed_node2 11762 1726853252.53725: Calling groups_inventory to load vars for managed_node2 11762 1726853252.53727: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.53737: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.53739: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.53741: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.53982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11762 1726853252.54181: done with get_vars() 11762 1726853252.54189: variable 'ansible_search_path' from source: unknown 11762 1726853252.54190: variable 'ansible_search_path' from source: unknown 11762 1726853252.54225: we have included files to process 11762 1726853252.54226: generating all_blocks data 11762 1726853252.54228: done generating all_blocks data 11762 1726853252.54233: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11762 1726853252.54243: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11762 1726853252.54246: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11762 1726853252.54857: done processing included file 11762 1726853252.54859: iterating over new_blocks loaded from include file 11762 1726853252.54860: in VariableManager get_vars() 11762 1726853252.54868: done with get_vars() 11762 1726853252.54869: filtering new block on tags 11762 1726853252.54885: done filtering new block on tags 11762 1726853252.54887: in VariableManager get_vars() 11762 1726853252.54897: done with get_vars() 11762 1726853252.54898: filtering new block on tags 11762 1726853252.54905: done filtering new block on tags 11762 1726853252.54906: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 11762 1726853252.54910: extending task lists for all hosts with included blocks 11762 1726853252.54973: done extending task lists 11762 1726853252.54974: done processing included files 11762 1726853252.54978: results queue empty 11762 1726853252.54979: checking for any_errors_fatal 11762 1726853252.54981: done checking for any_errors_fatal 11762 1726853252.54981: checking for max_fail_percentage 11762 1726853252.54982: done 
checking for max_fail_percentage 11762 1726853252.54982: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.54983: done checking to see if all hosts have failed 11762 1726853252.54983: getting the remaining hosts for this loop 11762 1726853252.54984: done getting the remaining hosts for this loop 11762 1726853252.54986: getting the next task for host managed_node2 11762 1726853252.54989: done getting next task for host managed_node2 11762 1726853252.54990: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11762 1726853252.54992: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.54993: getting variables 11762 1726853252.54994: in VariableManager get_vars() 11762 1726853252.55001: Calling all_inventory to load vars for managed_node2 11762 1726853252.55003: Calling groups_inventory to load vars for managed_node2 11762 1726853252.55005: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.55009: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.55015: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.55017: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.55117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.55230: done with get_vars() 11762 1726853252.55237: done getting variables 11762 1726853252.55287: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11762 1726853252.55422: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:27:32 -0400 (0:00:00.058) 0:00:02.985 ****** 11762 1726853252.55459: entering _queue_task() for managed_node2/command 11762 1726853252.55460: Creating lock for command 11762 1726853252.55689: worker is 1 (out of 1 available) 11762 1726853252.55703: exiting _queue_task() for managed_node2/command 11762 1726853252.55714: done queuing things up, now waiting for results queue to drain 11762 1726853252.55716: waiting for pending results... 
11762 1726853252.55866: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 11762 1726853252.55929: in run() - task 02083763-bbaf-d845-03d0-000000000046 11762 1726853252.55943: variable 'ansible_search_path' from source: unknown 11762 1726853252.55949: variable 'ansible_search_path' from source: unknown 11762 1726853252.55973: calling self._execute() 11762 1726853252.56025: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.56030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.56038: variable 'omit' from source: magic vars 11762 1726853252.56309: variable 'ansible_distribution' from source: facts 11762 1726853252.56318: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11762 1726853252.56404: variable 'ansible_distribution_major_version' from source: facts 11762 1726853252.56409: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11762 1726853252.56411: when evaluation is False, skipping this task 11762 1726853252.56415: _execute() done 11762 1726853252.56417: dumping result to json 11762 1726853252.56419: done dumping result, returning 11762 1726853252.56427: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [02083763-bbaf-d845-03d0-000000000046] 11762 1726853252.56432: sending task result for task 02083763-bbaf-d845-03d0-000000000046 11762 1726853252.56523: done sending task result for task 02083763-bbaf-d845-03d0-000000000046 11762 1726853252.56526: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11762 1726853252.56586: no more pending results, returning what we have 11762 1726853252.56589: results queue empty 11762 1726853252.56590: checking for any_errors_fatal 11762 1726853252.56591: done checking for any_errors_fatal 11762 1726853252.56592: checking for 
max_fail_percentage 11762 1726853252.56593: done checking for max_fail_percentage 11762 1726853252.56594: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.56595: done checking to see if all hosts have failed 11762 1726853252.56595: getting the remaining hosts for this loop 11762 1726853252.56597: done getting the remaining hosts for this loop 11762 1726853252.56600: getting the next task for host managed_node2 11762 1726853252.56606: done getting next task for host managed_node2 11762 1726853252.56608: ^ task is: TASK: Install yum-utils package 11762 1726853252.56611: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.56614: getting variables 11762 1726853252.56615: in VariableManager get_vars() 11762 1726853252.56639: Calling all_inventory to load vars for managed_node2 11762 1726853252.56641: Calling groups_inventory to load vars for managed_node2 11762 1726853252.56646: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.56655: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.56658: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.56660: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.56920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.57031: done with get_vars() 11762 1726853252.57038: done getting variables 11762 1726853252.57109: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:27:32 -0400 (0:00:00.016) 0:00:03.001 ****** 11762 1726853252.57131: entering _queue_task() for managed_node2/package 11762 1726853252.57132: Creating lock for package 11762 1726853252.57337: worker is 1 (out of 1 available) 11762 1726853252.57350: exiting _queue_task() for managed_node2/package 11762 1726853252.57363: done queuing things up, now waiting for results queue to drain 11762 1726853252.57365: waiting for pending results... 
11762 1726853252.57524: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 11762 1726853252.57593: in run() - task 02083763-bbaf-d845-03d0-000000000047 11762 1726853252.57608: variable 'ansible_search_path' from source: unknown 11762 1726853252.57611: variable 'ansible_search_path' from source: unknown 11762 1726853252.57639: calling self._execute() 11762 1726853252.57705: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.57709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.57717: variable 'omit' from source: magic vars 11762 1726853252.57992: variable 'ansible_distribution' from source: facts 11762 1726853252.58001: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11762 1726853252.58092: variable 'ansible_distribution_major_version' from source: facts 11762 1726853252.58095: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11762 1726853252.58099: when evaluation is False, skipping this task 11762 1726853252.58102: _execute() done 11762 1726853252.58105: dumping result to json 11762 1726853252.58107: done dumping result, returning 11762 1726853252.58114: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [02083763-bbaf-d845-03d0-000000000047] 11762 1726853252.58119: sending task result for task 02083763-bbaf-d845-03d0-000000000047 11762 1726853252.58208: done sending task result for task 02083763-bbaf-d845-03d0-000000000047 11762 1726853252.58212: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11762 1726853252.58276: no more pending results, returning what we have 11762 1726853252.58281: results queue empty 11762 1726853252.58282: checking for any_errors_fatal 11762 1726853252.58287: done checking for any_errors_fatal 11762 
1726853252.58288: checking for max_fail_percentage 11762 1726853252.58289: done checking for max_fail_percentage 11762 1726853252.58290: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.58290: done checking to see if all hosts have failed 11762 1726853252.58291: getting the remaining hosts for this loop 11762 1726853252.58292: done getting the remaining hosts for this loop 11762 1726853252.58295: getting the next task for host managed_node2 11762 1726853252.58300: done getting next task for host managed_node2 11762 1726853252.58302: ^ task is: TASK: Enable EPEL 7 11762 1726853252.58305: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.58307: getting variables 11762 1726853252.58308: in VariableManager get_vars() 11762 1726853252.58331: Calling all_inventory to load vars for managed_node2 11762 1726853252.58333: Calling groups_inventory to load vars for managed_node2 11762 1726853252.58335: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.58344: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.58347: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.58350: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.58459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.58581: done with get_vars() 11762 1726853252.58588: done getting variables 11762 1726853252.58630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:27:32 -0400 (0:00:00.015) 0:00:03.016 ****** 11762 1726853252.58649: entering _queue_task() for managed_node2/command 11762 1726853252.58841: worker is 1 (out of 1 available) 11762 1726853252.58854: exiting _queue_task() for managed_node2/command 11762 1726853252.58867: done queuing things up, now waiting for results queue to drain 11762 1726853252.58869: waiting for pending results... 
11762 1726853252.59022: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 11762 1726853252.59089: in run() - task 02083763-bbaf-d845-03d0-000000000048 11762 1726853252.59106: variable 'ansible_search_path' from source: unknown 11762 1726853252.59109: variable 'ansible_search_path' from source: unknown 11762 1726853252.59132: calling self._execute() 11762 1726853252.59190: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.59194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.59207: variable 'omit' from source: magic vars 11762 1726853252.59485: variable 'ansible_distribution' from source: facts 11762 1726853252.59495: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11762 1726853252.59585: variable 'ansible_distribution_major_version' from source: facts 11762 1726853252.59588: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11762 1726853252.59591: when evaluation is False, skipping this task 11762 1726853252.59596: _execute() done 11762 1726853252.59598: dumping result to json 11762 1726853252.59600: done dumping result, returning 11762 1726853252.59608: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [02083763-bbaf-d845-03d0-000000000048] 11762 1726853252.59613: sending task result for task 02083763-bbaf-d845-03d0-000000000048 11762 1726853252.59694: done sending task result for task 02083763-bbaf-d845-03d0-000000000048 11762 1726853252.59696: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11762 1726853252.59740: no more pending results, returning what we have 11762 1726853252.59743: results queue empty 11762 1726853252.59744: checking for any_errors_fatal 11762 1726853252.59749: done checking for any_errors_fatal 11762 1726853252.59750: checking for 
max_fail_percentage 11762 1726853252.59752: done checking for max_fail_percentage 11762 1726853252.59752: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.59753: done checking to see if all hosts have failed 11762 1726853252.59754: getting the remaining hosts for this loop 11762 1726853252.59755: done getting the remaining hosts for this loop 11762 1726853252.59759: getting the next task for host managed_node2 11762 1726853252.59765: done getting next task for host managed_node2 11762 1726853252.59767: ^ task is: TASK: Enable EPEL 8 11762 1726853252.59772: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.59775: getting variables 11762 1726853252.59777: in VariableManager get_vars() 11762 1726853252.59800: Calling all_inventory to load vars for managed_node2 11762 1726853252.59803: Calling groups_inventory to load vars for managed_node2 11762 1726853252.59805: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.59814: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.59817: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.59819: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.59986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.60101: done with get_vars() 11762 1726853252.60108: done getting variables 11762 1726853252.60150: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:27:32 -0400 (0:00:00.015) 0:00:03.032 ****** 11762 1726853252.60169: entering _queue_task() for managed_node2/command 11762 1726853252.60374: worker is 1 (out of 1 available) 11762 1726853252.60387: exiting _queue_task() for managed_node2/command 11762 1726853252.60399: done queuing things up, now waiting for results queue to drain 11762 1726853252.60401: waiting for pending results... 
11762 1726853252.60569: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 11762 1726853252.60631: in run() - task 02083763-bbaf-d845-03d0-000000000049 11762 1726853252.60645: variable 'ansible_search_path' from source: unknown 11762 1726853252.60648: variable 'ansible_search_path' from source: unknown 11762 1726853252.60680: calling self._execute() 11762 1726853252.60731: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.60735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.60745: variable 'omit' from source: magic vars 11762 1726853252.61027: variable 'ansible_distribution' from source: facts 11762 1726853252.61037: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11762 1726853252.61129: variable 'ansible_distribution_major_version' from source: facts 11762 1726853252.61133: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11762 1726853252.61136: when evaluation is False, skipping this task 11762 1726853252.61140: _execute() done 11762 1726853252.61142: dumping result to json 11762 1726853252.61144: done dumping result, returning 11762 1726853252.61154: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [02083763-bbaf-d845-03d0-000000000049] 11762 1726853252.61158: sending task result for task 02083763-bbaf-d845-03d0-000000000049 11762 1726853252.61241: done sending task result for task 02083763-bbaf-d845-03d0-000000000049 11762 1726853252.61243: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11762 1726853252.61381: no more pending results, returning what we have 11762 1726853252.61384: results queue empty 11762 1726853252.61385: checking for any_errors_fatal 11762 1726853252.61389: done checking for any_errors_fatal 11762 1726853252.61390: checking for 
max_fail_percentage 11762 1726853252.61391: done checking for max_fail_percentage 11762 1726853252.61392: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.61393: done checking to see if all hosts have failed 11762 1726853252.61393: getting the remaining hosts for this loop 11762 1726853252.61395: done getting the remaining hosts for this loop 11762 1726853252.61397: getting the next task for host managed_node2 11762 1726853252.61405: done getting next task for host managed_node2 11762 1726853252.61407: ^ task is: TASK: Enable EPEL 6 11762 1726853252.61410: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.61414: getting variables 11762 1726853252.61415: in VariableManager get_vars() 11762 1726853252.61442: Calling all_inventory to load vars for managed_node2 11762 1726853252.61444: Calling groups_inventory to load vars for managed_node2 11762 1726853252.61447: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.61455: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.61458: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.61460: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.61636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.61835: done with get_vars() 11762 1726853252.61846: done getting variables 11762 1726853252.61920: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:27:32 -0400 (0:00:00.017) 0:00:03.049 ****** 11762 1726853252.61949: entering _queue_task() for managed_node2/copy 11762 1726853252.62230: worker is 1 (out of 1 available) 11762 1726853252.62243: exiting _queue_task() for managed_node2/copy 11762 1726853252.62255: done queuing things up, now waiting for results queue to drain 11762 1726853252.62256: waiting for pending results... 
11762 1726853252.62589: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 11762 1726853252.62594: in run() - task 02083763-bbaf-d845-03d0-00000000004b 11762 1726853252.62598: variable 'ansible_search_path' from source: unknown 11762 1726853252.62600: variable 'ansible_search_path' from source: unknown 11762 1726853252.62603: calling self._execute() 11762 1726853252.62675: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.62689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.62705: variable 'omit' from source: magic vars 11762 1726853252.63086: variable 'ansible_distribution' from source: facts 11762 1726853252.63133: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11762 1726853252.63429: variable 'ansible_distribution_major_version' from source: facts 11762 1726853252.63433: Evaluated conditional (ansible_distribution_major_version == '6'): False 11762 1726853252.63435: when evaluation is False, skipping this task 11762 1726853252.63437: _execute() done 11762 1726853252.63439: dumping result to json 11762 1726853252.63441: done dumping result, returning 11762 1726853252.63447: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [02083763-bbaf-d845-03d0-00000000004b] 11762 1726853252.63458: sending task result for task 02083763-bbaf-d845-03d0-00000000004b skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11762 1726853252.63611: no more pending results, returning what we have 11762 1726853252.63614: results queue empty 11762 1726853252.63615: checking for any_errors_fatal 11762 1726853252.63620: done checking for any_errors_fatal 11762 1726853252.63621: checking for max_fail_percentage 11762 1726853252.63623: done checking for max_fail_percentage 11762 1726853252.63626: checking to see if all hosts have failed and the running 
result is not ok 11762 1726853252.63626: done checking to see if all hosts have failed 11762 1726853252.63627: getting the remaining hosts for this loop 11762 1726853252.63629: done getting the remaining hosts for this loop 11762 1726853252.63637: getting the next task for host managed_node2 11762 1726853252.63646: done getting next task for host managed_node2 11762 1726853252.63648: ^ task is: TASK: Set network provider to 'nm' 11762 1726853252.63651: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853252.63654: getting variables 11762 1726853252.63655: in VariableManager get_vars() 11762 1726853252.63694: Calling all_inventory to load vars for managed_node2 11762 1726853252.63696: Calling groups_inventory to load vars for managed_node2 11762 1726853252.63700: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.63711: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.63714: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.63718: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.63980: done sending task result for task 02083763-bbaf-d845-03d0-00000000004b 11762 1726853252.63983: WORKER PROCESS EXITING 11762 1726853252.64015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.64228: done with get_vars() 11762 1726853252.64237: done getting variables 11762 1726853252.64305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:13 Friday 20 September 2024 13:27:32 -0400 (0:00:00.023) 0:00:03.073 ****** 11762 1726853252.64340: entering _queue_task() for managed_node2/set_fact 11762 1726853252.64612: worker is 1 (out of 1 available) 11762 1726853252.64627: exiting _queue_task() for managed_node2/set_fact 11762 1726853252.64639: done queuing things up, now waiting for results queue to drain 11762 1726853252.64640: waiting for pending results... 11762 1726853252.64806: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 11762 1726853252.64891: in run() - task 02083763-bbaf-d845-03d0-000000000007 11762 1726853252.64912: variable 'ansible_search_path' from source: unknown 11762 1726853252.64956: calling self._execute() 11762 1726853252.65150: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.65163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.65179: variable 'omit' from source: magic vars 11762 1726853252.65380: variable 'omit' from source: magic vars 11762 1726853252.65383: variable 'omit' from source: magic vars 11762 1726853252.65386: variable 'omit' from source: magic vars 11762 1726853252.65413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853252.65462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853252.65493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853252.65516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853252.65538: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853252.65578: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853252.65589: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.65598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.65707: Set connection var ansible_timeout to 10 11762 1726853252.65717: Set connection var ansible_shell_type to sh 11762 1726853252.65730: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853252.65741: Set connection var ansible_shell_executable to /bin/sh 11762 1726853252.65758: Set connection var ansible_pipelining to False 11762 1726853252.65774: Set connection var ansible_connection to ssh 11762 1726853252.65877: variable 'ansible_shell_executable' from source: unknown 11762 1726853252.65881: variable 'ansible_connection' from source: unknown 11762 1726853252.65883: variable 'ansible_module_compression' from source: unknown 11762 1726853252.65885: variable 'ansible_shell_type' from source: unknown 11762 1726853252.65887: variable 'ansible_shell_executable' from source: unknown 11762 1726853252.65890: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.65892: variable 'ansible_pipelining' from source: unknown 11762 1726853252.65894: variable 'ansible_timeout' from source: unknown 11762 1726853252.65896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.66000: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853252.66018: variable 'omit' from source: magic vars 11762 1726853252.66030: starting 
attempt loop 11762 1726853252.66038: running the handler 11762 1726853252.66059: handler run complete 11762 1726853252.66078: attempt loop complete, returning result 11762 1726853252.66086: _execute() done 11762 1726853252.66093: dumping result to json 11762 1726853252.66101: done dumping result, returning 11762 1726853252.66112: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [02083763-bbaf-d845-03d0-000000000007] 11762 1726853252.66176: sending task result for task 02083763-bbaf-d845-03d0-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11762 1726853252.66329: no more pending results, returning what we have 11762 1726853252.66333: results queue empty 11762 1726853252.66334: checking for any_errors_fatal 11762 1726853252.66341: done checking for any_errors_fatal 11762 1726853252.66342: checking for max_fail_percentage 11762 1726853252.66347: done checking for max_fail_percentage 11762 1726853252.66348: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.66349: done checking to see if all hosts have failed 11762 1726853252.66350: getting the remaining hosts for this loop 11762 1726853252.66352: done getting the remaining hosts for this loop 11762 1726853252.66355: getting the next task for host managed_node2 11762 1726853252.66362: done getting next task for host managed_node2 11762 1726853252.66364: ^ task is: TASK: meta (flush_handlers) 11762 1726853252.66366: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.66373: getting variables 11762 1726853252.66374: in VariableManager get_vars() 11762 1726853252.66405: Calling all_inventory to load vars for managed_node2 11762 1726853252.66408: Calling groups_inventory to load vars for managed_node2 11762 1726853252.66412: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.66423: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.66427: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.66430: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.66763: done sending task result for task 02083763-bbaf-d845-03d0-000000000007 11762 1726853252.66766: WORKER PROCESS EXITING 11762 1726853252.66777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.66888: done with get_vars() 11762 1726853252.66895: done getting variables 11762 1726853252.66939: in VariableManager get_vars() 11762 1726853252.66949: Calling all_inventory to load vars for managed_node2 11762 1726853252.66950: Calling groups_inventory to load vars for managed_node2 11762 1726853252.66952: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.66955: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.66956: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.66958: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.67061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.67194: done with get_vars() 11762 1726853252.67205: done queuing things up, now waiting for results queue to drain 11762 1726853252.67207: results queue empty 11762 1726853252.67208: checking for any_errors_fatal 11762 1726853252.67210: done checking for any_errors_fatal 11762 1726853252.67210: checking for max_fail_percentage 11762 
1726853252.67211: done checking for max_fail_percentage 11762 1726853252.67212: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.67213: done checking to see if all hosts have failed 11762 1726853252.67213: getting the remaining hosts for this loop 11762 1726853252.67214: done getting the remaining hosts for this loop 11762 1726853252.67217: getting the next task for host managed_node2 11762 1726853252.67220: done getting next task for host managed_node2 11762 1726853252.67221: ^ task is: TASK: meta (flush_handlers) 11762 1726853252.67222: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853252.67229: getting variables 11762 1726853252.67230: in VariableManager get_vars() 11762 1726853252.67237: Calling all_inventory to load vars for managed_node2 11762 1726853252.67239: Calling groups_inventory to load vars for managed_node2 11762 1726853252.67241: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.67248: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.67250: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.67253: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.67385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.67561: done with get_vars() 11762 1726853252.67569: done getting variables 11762 1726853252.67616: in VariableManager get_vars() 11762 1726853252.67624: Calling all_inventory to load vars for managed_node2 11762 1726853252.67626: Calling groups_inventory to load vars for managed_node2 11762 1726853252.67628: Calling all_plugins_inventory to load vars for managed_node2 11762 
1726853252.67632: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.67634: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.67636: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.67766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.67974: done with get_vars() 11762 1726853252.67986: done queuing things up, now waiting for results queue to drain 11762 1726853252.67988: results queue empty 11762 1726853252.67989: checking for any_errors_fatal 11762 1726853252.67990: done checking for any_errors_fatal 11762 1726853252.67990: checking for max_fail_percentage 11762 1726853252.67991: done checking for max_fail_percentage 11762 1726853252.67992: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.67992: done checking to see if all hosts have failed 11762 1726853252.67993: getting the remaining hosts for this loop 11762 1726853252.67994: done getting the remaining hosts for this loop 11762 1726853252.67996: getting the next task for host managed_node2 11762 1726853252.67999: done getting next task for host managed_node2 11762 1726853252.67999: ^ task is: None 11762 1726853252.68001: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.68002: done queuing things up, now waiting for results queue to drain 11762 1726853252.68002: results queue empty 11762 1726853252.68003: checking for any_errors_fatal 11762 1726853252.68004: done checking for any_errors_fatal 11762 1726853252.68004: checking for max_fail_percentage 11762 1726853252.68005: done checking for max_fail_percentage 11762 1726853252.68006: checking to see if all hosts have failed and the running result is not ok 11762 1726853252.68007: done checking to see if all hosts have failed 11762 1726853252.68008: getting the next task for host managed_node2 11762 1726853252.68011: done getting next task for host managed_node2 11762 1726853252.68011: ^ task is: None 11762 1726853252.68013: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.68066: in VariableManager get_vars() 11762 1726853252.68083: done with get_vars() 11762 1726853252.68090: in VariableManager get_vars() 11762 1726853252.68100: done with get_vars() 11762 1726853252.68104: variable 'omit' from source: magic vars 11762 1726853252.68134: in VariableManager get_vars() 11762 1726853252.68147: done with get_vars() 11762 1726853252.68167: variable 'omit' from source: magic vars PLAY [Play for testing bond options] ******************************************* 11762 1726853252.68375: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11762 1726853252.68399: getting the remaining hosts for this loop 11762 1726853252.68400: done getting the remaining hosts for this loop 11762 1726853252.68402: getting the next task for host managed_node2 11762 1726853252.68404: done getting next task for host managed_node2 11762 1726853252.68405: ^ task is: TASK: Gathering Facts 11762 1726853252.68406: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853252.68407: getting variables 11762 1726853252.68408: in VariableManager get_vars() 11762 1726853252.68413: Calling all_inventory to load vars for managed_node2 11762 1726853252.68414: Calling groups_inventory to load vars for managed_node2 11762 1726853252.68415: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853252.68418: Calling all_plugins_play to load vars for managed_node2 11762 1726853252.68428: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853252.68430: Calling groups_plugins_play to load vars for managed_node2 11762 1726853252.68515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853252.68622: done with get_vars() 11762 1726853252.68628: done getting variables 11762 1726853252.68656: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 Friday 20 September 2024 13:27:32 -0400 (0:00:00.043) 0:00:03.117 ****** 11762 1726853252.68674: entering _queue_task() for managed_node2/gather_facts 11762 1726853252.68878: worker is 1 (out of 1 available) 11762 1726853252.68890: exiting _queue_task() for managed_node2/gather_facts 11762 1726853252.68902: done queuing things up, now waiting for results queue to drain 11762 1726853252.68904: waiting for pending results... 
11762 1726853252.69049: running TaskExecutor() for managed_node2/TASK: Gathering Facts 11762 1726853252.69102: in run() - task 02083763-bbaf-d845-03d0-000000000071 11762 1726853252.69113: variable 'ansible_search_path' from source: unknown 11762 1726853252.69143: calling self._execute() 11762 1726853252.69198: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.69202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.69210: variable 'omit' from source: magic vars 11762 1726853252.69549: variable 'ansible_distribution_major_version' from source: facts 11762 1726853252.69558: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853252.69562: variable 'omit' from source: magic vars 11762 1726853252.69581: variable 'omit' from source: magic vars 11762 1726853252.69607: variable 'omit' from source: magic vars 11762 1726853252.69636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853252.69666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853252.69682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853252.69695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853252.69725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853252.69751: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853252.69755: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.69757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.69820: Set connection var ansible_timeout to 10 11762 1726853252.69825: Set connection 
var ansible_shell_type to sh 11762 1726853252.69827: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853252.69833: Set connection var ansible_shell_executable to /bin/sh 11762 1726853252.69839: Set connection var ansible_pipelining to False 11762 1726853252.69848: Set connection var ansible_connection to ssh 11762 1726853252.69863: variable 'ansible_shell_executable' from source: unknown 11762 1726853252.69867: variable 'ansible_connection' from source: unknown 11762 1726853252.69870: variable 'ansible_module_compression' from source: unknown 11762 1726853252.69874: variable 'ansible_shell_type' from source: unknown 11762 1726853252.69876: variable 'ansible_shell_executable' from source: unknown 11762 1726853252.69878: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853252.69880: variable 'ansible_pipelining' from source: unknown 11762 1726853252.69882: variable 'ansible_timeout' from source: unknown 11762 1726853252.69890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853252.70016: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853252.70025: variable 'omit' from source: magic vars 11762 1726853252.70031: starting attempt loop 11762 1726853252.70034: running the handler 11762 1726853252.70049: variable 'ansible_facts' from source: unknown 11762 1726853252.70063: _low_level_execute_command(): starting 11762 1726853252.70069: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853252.70587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853252.70590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853252.70593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853252.70596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853252.70598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853252.70654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853252.70657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.70662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.70737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853252.72456: stdout chunk (state=3): >>>/root <<< 11762 1726853252.72603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.72606: stdout chunk (state=3): >>><<< 11762 1726853252.72610: stderr chunk (state=3): >>><<< 11762 1726853252.72727: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853252.72730: _low_level_execute_command(): starting 11762 1726853252.72733: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607 `" && echo ansible-tmp-1726853252.7264-11930-158485177756607="` echo /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607 `" ) && sleep 0' 11762 1726853252.73326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853252.73341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853252.73374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853252.73484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.73527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.73632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853252.75759: stdout chunk (state=3): >>>ansible-tmp-1726853252.7264-11930-158485177756607=/root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607 <<< 11762 1726853252.75764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.76277: stderr chunk (state=3): >>><<< 11762 1726853252.76281: stdout chunk (state=3): >>><<< 11762 1726853252.76284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853252.7264-11930-158485177756607=/root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853252.76288: variable 'ansible_module_compression' from source: unknown 11762 1726853252.76290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11762 1726853252.76293: variable 'ansible_facts' from source: unknown 11762 1726853252.77076: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/AnsiballZ_setup.py 11762 1726853252.77602: Sending initial data 11762 1726853252.77606: Sent initial data (151 bytes) 11762 1726853252.78656: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853252.78660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853252.78662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853252.78665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11762 1726853252.79098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.79177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.79585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853252.81190: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853252.81318: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853252.81360: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpjw2mpq08 /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/AnsiballZ_setup.py <<< 11762 1726853252.81426: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/AnsiballZ_setup.py" debug1: stat remote: No such file or directory <<< 11762 1726853252.81442: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpjw2mpq08" to remote "/root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/AnsiballZ_setup.py" <<< 11762 1726853252.84282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.84293: stdout chunk (state=3): >>><<< 11762 1726853252.84304: stderr chunk (state=3): >>><<< 11762 1726853252.84331: done transferring module to remote 11762 1726853252.84361: _low_level_execute_command(): starting 11762 1726853252.84373: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/ /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/AnsiballZ_setup.py && sleep 0' 11762 1726853252.85506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853252.85520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853252.85540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853252.85588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853252.85675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853252.85702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853252.85723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.85737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853252.85841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853252.87762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853252.87818: stdout chunk (state=3): >>><<< 11762 1726853252.87837: stderr chunk (state=3): >>><<< 11762 1726853252.87861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853252.87870: _low_level_execute_command(): starting 11762 1726853252.88153: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/AnsiballZ_setup.py && sleep 0' 11762 1726853252.89197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853252.89461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853252.89527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853252.89554: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853252.89690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853253.53553: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", 
"weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "33", "epoch": "1726853253", "epoch_int": "1726853253", "date": "2024-09-20", "time": "13:27:33", "iso8601_micro": "2024-09-20T17:27:33.171031Z", "iso8601": "2024-09-20T17:27:33Z", "iso8601_basic": "20240920T132733171031", "iso8601_basic_short": "20240920T132733", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.65185546875, "5m": 0.38134765625, "15m": 0.1826171875}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_system_capabilities_enforced"<<< 11762 1726853253.53711: stdout chunk (state=3): >>>: "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 463, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794037760, "block_size": 4096, "block_total": 65519099, "block_available": 63914560, "block_used": 1604539, "inode_total": 131070960, "inode_available": 131029089, "inode_used": 41871, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4"<<< 11762 1726853253.53750: stdout chunk (state=3): >>>: {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11762 1726853253.55774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853253.55778: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 11762 1726853253.55989: stderr chunk (state=3): >>><<< 11762 1726853253.55992: stdout chunk (state=3): >>><<< 11762 1726853253.55997: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "33", "epoch": "1726853253", "epoch_int": "1726853253", "date": "2024-09-20", "time": "13:27:33", "iso8601_micro": "2024-09-20T17:27:33.171031Z", "iso8601": "2024-09-20T17:27:33Z", "iso8601_basic": "20240920T132733171031", "iso8601_basic_short": "20240920T132733", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.65185546875, "5m": 0.38134765625, "15m": 0.1826171875}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": 
"ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 463, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794037760, "block_size": 4096, "block_total": 65519099, "block_available": 63914560, "block_used": 1604539, "inode_total": 131070960, "inode_available": 131029089, "inode_used": 41871, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", 
"GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", 
"prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": 
"off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853253.56716: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853253.56833: _low_level_execute_command(): starting 11762 1726853253.56836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853252.7264-11930-158485177756607/ > /dev/null 2>&1 && sleep 0' 11762 1726853253.57689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853253.57703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853253.57737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853253.57754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853253.57789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853253.57800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853253.57833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853253.57902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853253.57940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853253.57943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853253.58159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853253.59992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853253.60016: stderr chunk (state=3): >>><<< 11762 1726853253.60026: stdout chunk (state=3): >>><<< 11762 1726853253.60051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853253.60065: handler run complete 11762 1726853253.60258: variable 'ansible_facts' from source: unknown 11762 1726853253.60497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.61077: variable 'ansible_facts' from source: unknown 11762 1726853253.61184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.61351: attempt loop complete, returning result 11762 1726853253.61383: _execute() done 11762 1726853253.61386: dumping result to json 11762 1726853253.61412: done dumping result, returning 11762 1726853253.61477: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [02083763-bbaf-d845-03d0-000000000071] 11762 1726853253.61485: sending task result for task 02083763-bbaf-d845-03d0-000000000071 ok: [managed_node2] 11762 1726853253.62624: no more pending results, returning what we have 11762 1726853253.62627: results queue empty 11762 1726853253.62628: checking for any_errors_fatal 11762 1726853253.62629: done checking for any_errors_fatal 11762 1726853253.62629: checking for max_fail_percentage 11762 1726853253.62631: done checking for max_fail_percentage 11762 1726853253.62632: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.62632: done checking to see if all hosts have failed 
11762 1726853253.62633: getting the remaining hosts for this loop 11762 1726853253.62634: done getting the remaining hosts for this loop 11762 1726853253.62637: getting the next task for host managed_node2 11762 1726853253.62642: done getting next task for host managed_node2 11762 1726853253.62646: ^ task is: TASK: meta (flush_handlers) 11762 1726853253.62648: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853253.62651: getting variables 11762 1726853253.62653: in VariableManager get_vars() 11762 1726853253.62675: Calling all_inventory to load vars for managed_node2 11762 1726853253.62677: Calling groups_inventory to load vars for managed_node2 11762 1726853253.62680: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.62686: done sending task result for task 02083763-bbaf-d845-03d0-000000000071 11762 1726853253.62693: WORKER PROCESS EXITING 11762 1726853253.62702: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.62704: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.62707: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.62865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.63063: done with get_vars() 11762 1726853253.63076: done getting variables 11762 1726853253.63178: in VariableManager get_vars() 11762 1726853253.63187: Calling all_inventory to load vars for managed_node2 11762 1726853253.63191: Calling groups_inventory to load vars for managed_node2 11762 1726853253.63194: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.63198: Calling all_plugins_play to load vars for managed_node2 11762 
1726853253.63205: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.63209: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.63295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.63412: done with get_vars() 11762 1726853253.63421: done queuing things up, now waiting for results queue to drain 11762 1726853253.63423: results queue empty 11762 1726853253.63423: checking for any_errors_fatal 11762 1726853253.63429: done checking for any_errors_fatal 11762 1726853253.63430: checking for max_fail_percentage 11762 1726853253.63430: done checking for max_fail_percentage 11762 1726853253.63431: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.63431: done checking to see if all hosts have failed 11762 1726853253.63432: getting the remaining hosts for this loop 11762 1726853253.63432: done getting the remaining hosts for this loop 11762 1726853253.63434: getting the next task for host managed_node2 11762 1726853253.63437: done getting next task for host managed_node2 11762 1726853253.63438: ^ task is: TASK: Show playbook name 11762 1726853253.63440: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853253.63443: getting variables 11762 1726853253.63446: in VariableManager get_vars() 11762 1726853253.63454: Calling all_inventory to load vars for managed_node2 11762 1726853253.63455: Calling groups_inventory to load vars for managed_node2 11762 1726853253.63457: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.63460: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.63461: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.63463: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.63539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.63645: done with get_vars() 11762 1726853253.63651: done getting variables 11762 1726853253.63706: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:32 Friday 20 September 2024 13:27:33 -0400 (0:00:00.950) 0:00:04.067 ****** 11762 1726853253.63724: entering _queue_task() for managed_node2/debug 11762 1726853253.63726: Creating lock for debug 11762 1726853253.63952: worker is 1 (out of 1 available) 11762 1726853253.63966: exiting _queue_task() for managed_node2/debug 11762 1726853253.63979: done queuing things up, now waiting for results queue to drain 11762 1726853253.63981: waiting for pending results... 
11762 1726853253.64131: running TaskExecutor() for managed_node2/TASK: Show playbook name 11762 1726853253.64185: in run() - task 02083763-bbaf-d845-03d0-00000000000b 11762 1726853253.64196: variable 'ansible_search_path' from source: unknown 11762 1726853253.64229: calling self._execute() 11762 1726853253.64281: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.64286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.64294: variable 'omit' from source: magic vars 11762 1726853253.64780: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.64784: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.64786: variable 'omit' from source: magic vars 11762 1726853253.64789: variable 'omit' from source: magic vars 11762 1726853253.64791: variable 'omit' from source: magic vars 11762 1726853253.64819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853253.64862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.64896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853253.64917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.64933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.64970: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.64981: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.64993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.65126: Set connection var ansible_timeout to 10 11762 1726853253.65135: Set connection 
var ansible_shell_type to sh 11762 1726853253.65150: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.65161: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.65176: Set connection var ansible_pipelining to False 11762 1726853253.65215: Set connection var ansible_connection to ssh 11762 1726853253.65222: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.65230: variable 'ansible_connection' from source: unknown 11762 1726853253.65236: variable 'ansible_module_compression' from source: unknown 11762 1726853253.65243: variable 'ansible_shell_type' from source: unknown 11762 1726853253.65325: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.65328: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.65330: variable 'ansible_pipelining' from source: unknown 11762 1726853253.65333: variable 'ansible_timeout' from source: unknown 11762 1726853253.65334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.65655: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.65678: variable 'omit' from source: magic vars 11762 1726853253.65705: starting attempt loop 11762 1726853253.65713: running the handler 11762 1726853253.65775: handler run complete 11762 1726853253.65804: attempt loop complete, returning result 11762 1726853253.65812: _execute() done 11762 1726853253.65819: dumping result to json 11762 1726853253.65828: done dumping result, returning 11762 1726853253.65840: done running TaskExecutor() for managed_node2/TASK: Show playbook name [02083763-bbaf-d845-03d0-00000000000b] 11762 1726853253.65854: sending task result for task 
02083763-bbaf-d845-03d0-00000000000b ok: [managed_node2] => {} MSG: this is: playbooks/tests_bond_options.yml 11762 1726853253.66023: no more pending results, returning what we have 11762 1726853253.66028: results queue empty 11762 1726853253.66028: checking for any_errors_fatal 11762 1726853253.66030: done checking for any_errors_fatal 11762 1726853253.66031: checking for max_fail_percentage 11762 1726853253.66033: done checking for max_fail_percentage 11762 1726853253.66034: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.66034: done checking to see if all hosts have failed 11762 1726853253.66035: getting the remaining hosts for this loop 11762 1726853253.66038: done getting the remaining hosts for this loop 11762 1726853253.66041: getting the next task for host managed_node2 11762 1726853253.66050: done getting next task for host managed_node2 11762 1726853253.66052: ^ task is: TASK: Include the task 'run_test.yml' 11762 1726853253.66055: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853253.66058: getting variables 11762 1726853253.66060: in VariableManager get_vars() 11762 1726853253.66207: Calling all_inventory to load vars for managed_node2 11762 1726853253.66210: Calling groups_inventory to load vars for managed_node2 11762 1726853253.66214: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.66224: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.66228: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.66231: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.66641: done sending task result for task 02083763-bbaf-d845-03d0-00000000000b 11762 1726853253.66647: WORKER PROCESS EXITING 11762 1726853253.66672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.66896: done with get_vars() 11762 1726853253.66905: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:42 Friday 20 September 2024 13:27:33 -0400 (0:00:00.034) 0:00:04.102 ****** 11762 1726853253.67220: entering _queue_task() for managed_node2/include_tasks 11762 1726853253.67647: worker is 1 (out of 1 available) 11762 1726853253.67660: exiting _queue_task() for managed_node2/include_tasks 11762 1726853253.67776: done queuing things up, now waiting for results queue to drain 11762 1726853253.67779: waiting for pending results... 
11762 1726853253.68106: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 11762 1726853253.68181: in run() - task 02083763-bbaf-d845-03d0-00000000000d 11762 1726853253.68310: variable 'ansible_search_path' from source: unknown 11762 1726853253.68348: calling self._execute() 11762 1726853253.68528: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.68534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.68549: variable 'omit' from source: magic vars 11762 1726853253.69466: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.69492: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.69504: _execute() done 11762 1726853253.69524: dumping result to json 11762 1726853253.69528: done dumping result, returning 11762 1726853253.69577: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-d845-03d0-00000000000d] 11762 1726853253.69581: sending task result for task 02083763-bbaf-d845-03d0-00000000000d 11762 1726853253.69703: no more pending results, returning what we have 11762 1726853253.69708: in VariableManager get_vars() 11762 1726853253.69742: Calling all_inventory to load vars for managed_node2 11762 1726853253.69746: Calling groups_inventory to load vars for managed_node2 11762 1726853253.69750: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.69761: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.69764: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.69767: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.70256: done sending task result for task 02083763-bbaf-d845-03d0-00000000000d 11762 1726853253.70260: WORKER PROCESS EXITING 11762 1726853253.70519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 11762 1726853253.70849: done with get_vars() 11762 1726853253.70858: variable 'ansible_search_path' from source: unknown 11762 1726853253.70940: we have included files to process 11762 1726853253.70941: generating all_blocks data 11762 1726853253.70943: done generating all_blocks data 11762 1726853253.70946: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11762 1726853253.70947: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11762 1726853253.70951: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11762 1726853253.72149: in VariableManager get_vars() 11762 1726853253.72280: done with get_vars() 11762 1726853253.72318: in VariableManager get_vars() 11762 1726853253.72333: done with get_vars() 11762 1726853253.72477: in VariableManager get_vars() 11762 1726853253.72497: done with get_vars() 11762 1726853253.72534: in VariableManager get_vars() 11762 1726853253.72552: done with get_vars() 11762 1726853253.72795: in VariableManager get_vars() 11762 1726853253.72815: done with get_vars() 11762 1726853253.73604: in VariableManager get_vars() 11762 1726853253.73619: done with get_vars() 11762 1726853253.73631: done processing included file 11762 1726853253.73633: iterating over new_blocks loaded from include file 11762 1726853253.73634: in VariableManager get_vars() 11762 1726853253.73647: done with get_vars() 11762 1726853253.73649: filtering new block on tags 11762 1726853253.73996: done filtering new block on tags 11762 1726853253.74000: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 11762 1726853253.74005: extending task lists for all hosts with included 
blocks 11762 1726853253.74048: done extending task lists 11762 1726853253.74050: done processing included files 11762 1726853253.74050: results queue empty 11762 1726853253.74051: checking for any_errors_fatal 11762 1726853253.74055: done checking for any_errors_fatal 11762 1726853253.74056: checking for max_fail_percentage 11762 1726853253.74058: done checking for max_fail_percentage 11762 1726853253.74058: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.74059: done checking to see if all hosts have failed 11762 1726853253.74060: getting the remaining hosts for this loop 11762 1726853253.74061: done getting the remaining hosts for this loop 11762 1726853253.74063: getting the next task for host managed_node2 11762 1726853253.74067: done getting next task for host managed_node2 11762 1726853253.74069: ^ task is: TASK: TEST: {{ lsr_description }} 11762 1726853253.74176: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853253.74180: getting variables 11762 1726853253.74181: in VariableManager get_vars() 11762 1726853253.74189: Calling all_inventory to load vars for managed_node2 11762 1726853253.74191: Calling groups_inventory to load vars for managed_node2 11762 1726853253.74193: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.74198: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.74200: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.74202: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.74608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.75048: done with get_vars() 11762 1726853253.75057: done getting variables 11762 1726853253.75103: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853253.75408: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:27:33 -0400 (0:00:00.082) 0:00:04.184 ****** 11762 1726853253.75453: entering _queue_task() for managed_node2/debug 11762 1726853253.75941: worker is 1 (out of 1 available) 11762 1726853253.75955: exiting _queue_task() for managed_node2/debug 11762 1726853253.75966: done queuing things up, now waiting for results queue to drain 11762 1726853253.75968: waiting for pending results... 
11762 1726853253.76235: running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 11762 1726853253.76341: in run() - task 02083763-bbaf-d845-03d0-000000000088 11762 1726853253.76365: variable 'ansible_search_path' from source: unknown 11762 1726853253.76375: variable 'ansible_search_path' from source: unknown 11762 1726853253.76415: calling self._execute() 11762 1726853253.76498: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.76509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.76522: variable 'omit' from source: magic vars 11762 1726853253.76888: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.76905: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.76917: variable 'omit' from source: magic vars 11762 1726853253.76961: variable 'omit' from source: magic vars 11762 1726853253.77066: variable 'lsr_description' from source: include params 11762 1726853253.77096: variable 'omit' from source: magic vars 11762 1726853253.77138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853253.77195: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.77221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853253.77297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.77299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.77302: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.77307: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.77315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.77427: Set connection var ansible_timeout to 10 11762 1726853253.77436: Set connection var ansible_shell_type to sh 11762 1726853253.77450: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.77462: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.77477: Set connection var ansible_pipelining to False 11762 1726853253.77489: Set connection var ansible_connection to ssh 11762 1726853253.77523: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.77575: variable 'ansible_connection' from source: unknown 11762 1726853253.77579: variable 'ansible_module_compression' from source: unknown 11762 1726853253.77581: variable 'ansible_shell_type' from source: unknown 11762 1726853253.77583: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.77586: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.77588: variable 'ansible_pipelining' from source: unknown 11762 1726853253.77590: variable 'ansible_timeout' from source: unknown 11762 1726853253.77591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.77719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.77742: variable 'omit' from source: magic vars 11762 1726853253.77757: starting attempt loop 11762 1726853253.77765: running the handler 11762 1726853253.77841: handler run complete 11762 1726853253.77848: attempt loop complete, 
returning result 11762 1726853253.77850: _execute() done 11762 1726853253.77852: dumping result to json 11762 1726853253.77854: done dumping result, returning 11762 1726853253.77863: done running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [02083763-bbaf-d845-03d0-000000000088] 11762 1726853253.77875: sending task result for task 02083763-bbaf-d845-03d0-000000000088 11762 1726853253.78017: done sending task result for task 02083763-bbaf-d845-03d0-000000000088 11762 1726853253.78021: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 11762 1726853253.78105: no more pending results, returning what we have 11762 1726853253.78108: results queue empty 11762 1726853253.78109: checking for any_errors_fatal 11762 1726853253.78111: done checking for any_errors_fatal 11762 1726853253.78111: checking for max_fail_percentage 11762 1726853253.78113: done checking for max_fail_percentage 11762 1726853253.78114: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.78115: done checking to see if all hosts have failed 11762 1726853253.78115: getting the remaining hosts for this loop 11762 1726853253.78117: done getting the remaining hosts for this loop 11762 1726853253.78121: getting the next task for host managed_node2 11762 1726853253.78127: done getting next task for host managed_node2 11762 1726853253.78130: ^ task is: TASK: Show item 11762 1726853253.78132: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853253.78136: getting variables 11762 1726853253.78137: in VariableManager get_vars() 11762 1726853253.78169: Calling all_inventory to load vars for managed_node2 11762 1726853253.78174: Calling groups_inventory to load vars for managed_node2 11762 1726853253.78179: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.78190: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.78193: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.78196: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.78486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.78801: done with get_vars() 11762 1726853253.78811: done getting variables 11762 1726853253.78875: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:27:33 -0400 (0:00:00.034) 0:00:04.219 ****** 11762 1726853253.78907: entering _queue_task() for managed_node2/debug 11762 1726853253.79277: worker is 1 (out of 1 available) 11762 
1726853253.79289: exiting _queue_task() for managed_node2/debug 11762 1726853253.79300: done queuing things up, now waiting for results queue to drain 11762 1726853253.79302: waiting for pending results... 11762 1726853253.79592: running TaskExecutor() for managed_node2/TASK: Show item 11762 1726853253.79596: in run() - task 02083763-bbaf-d845-03d0-000000000089 11762 1726853253.79599: variable 'ansible_search_path' from source: unknown 11762 1726853253.79602: variable 'ansible_search_path' from source: unknown 11762 1726853253.79638: variable 'omit' from source: magic vars 11762 1726853253.79764: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.79780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.79795: variable 'omit' from source: magic vars 11762 1726853253.80452: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.80561: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.80564: variable 'omit' from source: magic vars 11762 1726853253.80567: variable 'omit' from source: magic vars 11762 1726853253.80573: variable 'item' from source: unknown 11762 1726853253.80655: variable 'item' from source: unknown 11762 1726853253.80692: variable 'omit' from source: magic vars 11762 1726853253.80734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853253.80781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.80808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853253.80885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.80888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11762 1726853253.80891: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.80893: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.80896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.81011: Set connection var ansible_timeout to 10 11762 1726853253.81025: Set connection var ansible_shell_type to sh 11762 1726853253.81037: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.81051: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.81065: Set connection var ansible_pipelining to False 11762 1726853253.81080: Set connection var ansible_connection to ssh 11762 1726853253.81111: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.81132: variable 'ansible_connection' from source: unknown 11762 1726853253.81135: variable 'ansible_module_compression' from source: unknown 11762 1726853253.81138: variable 'ansible_shell_type' from source: unknown 11762 1726853253.81210: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.81214: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.81216: variable 'ansible_pipelining' from source: unknown 11762 1726853253.81218: variable 'ansible_timeout' from source: unknown 11762 1726853253.81220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.81323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.81339: variable 'omit' from source: magic vars 11762 1726853253.81357: starting attempt loop 11762 1726853253.81365: running the handler 11762 1726853253.81414: variable 
'lsr_description' from source: include params 11762 1726853253.81493: variable 'lsr_description' from source: include params 11762 1726853253.81506: handler run complete 11762 1726853253.81527: attempt loop complete, returning result 11762 1726853253.81553: variable 'item' from source: unknown 11762 1726853253.81620: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 11762 1726853253.81907: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.81910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.81913: variable 'omit' from source: magic vars 11762 1726853253.82078: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.82081: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.82084: variable 'omit' from source: magic vars 11762 1726853253.82086: variable 'omit' from source: magic vars 11762 1726853253.82115: variable 'item' from source: unknown 11762 1726853253.82188: variable 'item' from source: unknown 11762 1726853253.82208: variable 'omit' from source: magic vars 11762 1726853253.82236: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.82253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.82263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.82295: variable 'inventory_hostname' 
from source: host vars for 'managed_node2' 11762 1726853253.82298: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.82300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.82404: Set connection var ansible_timeout to 10 11762 1726853253.82407: Set connection var ansible_shell_type to sh 11762 1726853253.82409: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.82411: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.82413: Set connection var ansible_pipelining to False 11762 1726853253.82423: Set connection var ansible_connection to ssh 11762 1726853253.82452: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.82461: variable 'ansible_connection' from source: unknown 11762 1726853253.82512: variable 'ansible_module_compression' from source: unknown 11762 1726853253.82515: variable 'ansible_shell_type' from source: unknown 11762 1726853253.82517: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.82519: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.82521: variable 'ansible_pipelining' from source: unknown 11762 1726853253.82523: variable 'ansible_timeout' from source: unknown 11762 1726853253.82525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.82601: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.82621: variable 'omit' from source: magic vars 11762 1726853253.82632: starting attempt loop 11762 1726853253.82640: running the handler 11762 1726853253.82730: variable 'lsr_setup' from source: include params 11762 1726853253.82755: variable 
'lsr_setup' from source: include params 11762 1726853253.82809: handler run complete 11762 1726853253.82841: attempt loop complete, returning result 11762 1726853253.82865: variable 'item' from source: unknown 11762 1726853253.82937: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 11762 1726853253.83276: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.83279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.83282: variable 'omit' from source: magic vars 11762 1726853253.83292: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.83302: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.83309: variable 'omit' from source: magic vars 11762 1726853253.83326: variable 'omit' from source: magic vars 11762 1726853253.83368: variable 'item' from source: unknown 11762 1726853253.83438: variable 'item' from source: unknown 11762 1726853253.83460: variable 'omit' from source: magic vars 11762 1726853253.83484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.83506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.83616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.83619: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.83621: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.83624: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11762 1726853253.83626: Set connection var ansible_timeout to 10 11762 1726853253.83628: Set connection var ansible_shell_type to sh 11762 1726853253.83630: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.83639: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.83654: Set connection var ansible_pipelining to False 11762 1726853253.83664: Set connection var ansible_connection to ssh 11762 1726853253.83687: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.83694: variable 'ansible_connection' from source: unknown 11762 1726853253.83701: variable 'ansible_module_compression' from source: unknown 11762 1726853253.83707: variable 'ansible_shell_type' from source: unknown 11762 1726853253.83712: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.83724: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.83732: variable 'ansible_pipelining' from source: unknown 11762 1726853253.83738: variable 'ansible_timeout' from source: unknown 11762 1726853253.83747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.83848: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.83860: variable 'omit' from source: magic vars 11762 1726853253.83868: starting attempt loop 11762 1726853253.83876: running the handler 11762 1726853253.83898: variable 'lsr_test' from source: include params 11762 1726853253.83970: variable 'lsr_test' from source: include params 11762 1726853253.83992: handler run complete 11762 1726853253.84009: attempt loop complete, returning result 11762 1726853253.84052: variable 'item' from source: unknown 11762 
1726853253.84098: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile.yml" ] } 11762 1726853253.84307: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.84310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.84312: variable 'omit' from source: magic vars 11762 1726853253.84433: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.84443: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.84455: variable 'omit' from source: magic vars 11762 1726853253.84475: variable 'omit' from source: magic vars 11762 1726853253.84527: variable 'item' from source: unknown 11762 1726853253.84599: variable 'item' from source: unknown 11762 1726853253.84617: variable 'omit' from source: magic vars 11762 1726853253.84703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.84706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.84709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.84711: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.84713: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.84715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.84768: Set connection var ansible_timeout to 10 11762 1726853253.84778: Set connection var ansible_shell_type to sh 11762 1726853253.84789: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.84798: Set connection var 
ansible_shell_executable to /bin/sh 11762 1726853253.84813: Set connection var ansible_pipelining to False 11762 1726853253.84823: Set connection var ansible_connection to ssh 11762 1726853253.84847: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.84858: variable 'ansible_connection' from source: unknown 11762 1726853253.84865: variable 'ansible_module_compression' from source: unknown 11762 1726853253.84920: variable 'ansible_shell_type' from source: unknown 11762 1726853253.84923: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.84925: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.84927: variable 'ansible_pipelining' from source: unknown 11762 1726853253.84929: variable 'ansible_timeout' from source: unknown 11762 1726853253.84931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.85003: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.85017: variable 'omit' from source: magic vars 11762 1726853253.85032: starting attempt loop 11762 1726853253.85038: running the handler 11762 1726853253.85063: variable 'lsr_assert' from source: include params 11762 1726853253.85129: variable 'lsr_assert' from source: include params 11762 1726853253.85175: handler run complete 11762 1726853253.85185: attempt loop complete, returning result 11762 1726853253.85196: variable 'item' from source: unknown 11762 1726853253.85263: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_controller_device_present.yml", "tasks/assert_bond_port_profile_present.yml", 
"tasks/assert_bond_options.yml" ] } 11762 1726853253.85341: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.85352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.85361: variable 'omit' from source: magic vars 11762 1726853253.85464: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.85472: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.85475: variable 'omit' from source: magic vars 11762 1726853253.85484: variable 'omit' from source: magic vars 11762 1726853253.85508: variable 'item' from source: unknown 11762 1726853253.85550: variable 'item' from source: unknown 11762 1726853253.85561: variable 'omit' from source: magic vars 11762 1726853253.85580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.85587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.85593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.85602: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.85605: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.85607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.85652: Set connection var ansible_timeout to 10 11762 1726853253.85655: Set connection var ansible_shell_type to sh 11762 1726853253.85661: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.85664: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.85672: Set connection var ansible_pipelining to False 11762 1726853253.85678: Set connection var 
ansible_connection to ssh 11762 1726853253.85694: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.85697: variable 'ansible_connection' from source: unknown 11762 1726853253.85699: variable 'ansible_module_compression' from source: unknown 11762 1726853253.85702: variable 'ansible_shell_type' from source: unknown 11762 1726853253.85704: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.85706: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.85710: variable 'ansible_pipelining' from source: unknown 11762 1726853253.85712: variable 'ansible_timeout' from source: unknown 11762 1726853253.85716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.85773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.85780: variable 'omit' from source: magic vars 11762 1726853253.85784: starting attempt loop 11762 1726853253.85788: running the handler 11762 1726853253.85861: handler run complete 11762 1726853253.85870: attempt loop complete, returning result 11762 1726853253.85884: variable 'item' from source: unknown 11762 1726853253.85927: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 11762 1726853253.86253: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.86256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.86258: variable 'omit' from source: magic vars 11762 1726853253.86260: variable 'ansible_distribution_major_version' from source: 
facts 11762 1726853253.86262: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.86264: variable 'omit' from source: magic vars 11762 1726853253.86267: variable 'omit' from source: magic vars 11762 1726853253.86269: variable 'item' from source: unknown 11762 1726853253.86273: variable 'item' from source: unknown 11762 1726853253.86286: variable 'omit' from source: magic vars 11762 1726853253.86297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.86303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.86309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.86317: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.86320: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.86322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.86367: Set connection var ansible_timeout to 10 11762 1726853253.86370: Set connection var ansible_shell_type to sh 11762 1726853253.86374: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.86380: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.86386: Set connection var ansible_pipelining to False 11762 1726853253.86395: Set connection var ansible_connection to ssh 11762 1726853253.86408: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.86411: variable 'ansible_connection' from source: unknown 11762 1726853253.86413: variable 'ansible_module_compression' from source: unknown 11762 1726853253.86415: variable 'ansible_shell_type' from source: unknown 11762 1726853253.86417: variable 
'ansible_shell_executable' from source: unknown 11762 1726853253.86420: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.86422: variable 'ansible_pipelining' from source: unknown 11762 1726853253.86426: variable 'ansible_timeout' from source: unknown 11762 1726853253.86430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.86488: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.86494: variable 'omit' from source: magic vars 11762 1726853253.86498: starting attempt loop 11762 1726853253.86501: running the handler 11762 1726853253.86516: variable 'lsr_fail_debug' from source: play vars 11762 1726853253.86560: variable 'lsr_fail_debug' from source: play vars 11762 1726853253.86574: handler run complete 11762 1726853253.86585: attempt loop complete, returning result 11762 1726853253.86595: variable 'item' from source: unknown 11762 1726853253.86638: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 11762 1726853253.86712: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.86715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.86717: variable 'omit' from source: magic vars 11762 1726853253.86814: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.86817: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.86820: variable 'omit' from source: magic vars 11762 1726853253.86837: variable 'omit' from source: magic vars 11762 1726853253.86861: 
variable 'item' from source: unknown 11762 1726853253.86920: variable 'item' from source: unknown 11762 1726853253.86931: variable 'omit' from source: magic vars 11762 1726853253.86946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.86959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.86966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.86968: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.87003: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.87006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.87090: Set connection var ansible_timeout to 10 11762 1726853253.87094: Set connection var ansible_shell_type to sh 11762 1726853253.87097: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.87099: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.87101: Set connection var ansible_pipelining to False 11762 1726853253.87103: Set connection var ansible_connection to ssh 11762 1726853253.87106: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.87108: variable 'ansible_connection' from source: unknown 11762 1726853253.87110: variable 'ansible_module_compression' from source: unknown 11762 1726853253.87111: variable 'ansible_shell_type' from source: unknown 11762 1726853253.87113: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.87115: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.87117: variable 'ansible_pipelining' from source: unknown 11762 1726853253.87119: variable 'ansible_timeout' from 
source: unknown 11762 1726853253.87121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.87189: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.87194: variable 'omit' from source: magic vars 11762 1726853253.87376: starting attempt loop 11762 1726853253.87380: running the handler 11762 1726853253.87383: variable 'lsr_cleanup' from source: include params 11762 1726853253.87385: variable 'lsr_cleanup' from source: include params 11762 1726853253.87388: handler run complete 11762 1726853253.87390: attempt loop complete, returning result 11762 1726853253.87392: variable 'item' from source: unknown 11762 1726853253.87434: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml" ] } 11762 1726853253.87535: dumping result to json 11762 1726853253.87554: done dumping result, returning 11762 1726853253.87565: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-d845-03d0-000000000089] 11762 1726853253.87577: sending task result for task 02083763-bbaf-d845-03d0-000000000089 11762 1726853253.87712: no more pending results, returning what we have 11762 1726853253.87715: results queue empty 11762 1726853253.87716: checking for any_errors_fatal 11762 1726853253.87721: done checking for any_errors_fatal 11762 1726853253.87721: checking for max_fail_percentage 11762 1726853253.87722: done checking for max_fail_percentage 11762 1726853253.87723: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.87724: done checking to see if all 
hosts have failed 11762 1726853253.87724: getting the remaining hosts for this loop 11762 1726853253.87726: done getting the remaining hosts for this loop 11762 1726853253.87729: getting the next task for host managed_node2 11762 1726853253.87734: done getting next task for host managed_node2 11762 1726853253.87736: ^ task is: TASK: Include the task 'show_interfaces.yml' 11762 1726853253.87738: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853253.87740: getting variables 11762 1726853253.87741: in VariableManager get_vars() 11762 1726853253.87766: Calling all_inventory to load vars for managed_node2 11762 1726853253.87768: Calling groups_inventory to load vars for managed_node2 11762 1726853253.87773: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.87784: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.87786: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.87789: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.87987: done sending task result for task 02083763-bbaf-d845-03d0-000000000089 11762 1726853253.87991: WORKER PROCESS EXITING 11762 1726853253.88008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.88164: done with get_vars() 11762 1726853253.88193: done getting variables TASK [Include the task 'show_interfaces.yml'] 
********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 13:27:33 -0400 (0:00:00.093) 0:00:04.313 ****** 11762 1726853253.88287: entering _queue_task() for managed_node2/include_tasks 11762 1726853253.88701: worker is 1 (out of 1 available) 11762 1726853253.88710: exiting _queue_task() for managed_node2/include_tasks 11762 1726853253.88720: done queuing things up, now waiting for results queue to drain 11762 1726853253.88722: waiting for pending results... 11762 1726853253.88855: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 11762 1726853253.88933: in run() - task 02083763-bbaf-d845-03d0-00000000008a 11762 1726853253.89058: variable 'ansible_search_path' from source: unknown 11762 1726853253.89061: variable 'ansible_search_path' from source: unknown 11762 1726853253.89064: calling self._execute() 11762 1726853253.89092: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.89110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.89117: variable 'omit' from source: magic vars 11762 1726853253.89413: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.89423: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.89429: _execute() done 11762 1726853253.89432: dumping result to json 11762 1726853253.89434: done dumping result, returning 11762 1726853253.89441: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-d845-03d0-00000000008a] 11762 1726853253.89452: sending task result for task 02083763-bbaf-d845-03d0-00000000008a 11762 1726853253.89531: done sending task result for task 02083763-bbaf-d845-03d0-00000000008a 11762 1726853253.89535: WORKER PROCESS EXITING 11762 1726853253.89562: no more pending results, 
returning what we have 11762 1726853253.89567: in VariableManager get_vars() 11762 1726853253.89601: Calling all_inventory to load vars for managed_node2 11762 1726853253.89604: Calling groups_inventory to load vars for managed_node2 11762 1726853253.89607: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.89618: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.89620: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.89623: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.89778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.89891: done with get_vars() 11762 1726853253.89896: variable 'ansible_search_path' from source: unknown 11762 1726853253.89897: variable 'ansible_search_path' from source: unknown 11762 1726853253.89925: we have included files to process 11762 1726853253.89926: generating all_blocks data 11762 1726853253.89928: done generating all_blocks data 11762 1726853253.89930: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11762 1726853253.89931: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11762 1726853253.89933: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11762 1726853253.90031: in VariableManager get_vars() 11762 1726853253.90043: done with get_vars() 11762 1726853253.90118: done processing included file 11762 1726853253.90120: iterating over new_blocks loaded from include file 11762 1726853253.90121: in VariableManager get_vars() 11762 1726853253.90129: done with get_vars() 11762 1726853253.90130: filtering new block on tags 11762 1726853253.90154: done filtering new block on tags 11762 
1726853253.90156: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 11762 1726853253.90159: extending task lists for all hosts with included blocks 11762 1726853253.90408: done extending task lists 11762 1726853253.90409: done processing included files 11762 1726853253.90409: results queue empty 11762 1726853253.90410: checking for any_errors_fatal 11762 1726853253.90414: done checking for any_errors_fatal 11762 1726853253.90414: checking for max_fail_percentage 11762 1726853253.90415: done checking for max_fail_percentage 11762 1726853253.90415: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.90416: done checking to see if all hosts have failed 11762 1726853253.90416: getting the remaining hosts for this loop 11762 1726853253.90417: done getting the remaining hosts for this loop 11762 1726853253.90418: getting the next task for host managed_node2 11762 1726853253.90421: done getting next task for host managed_node2 11762 1726853253.90422: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 11762 1726853253.90424: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853253.90426: getting variables 11762 1726853253.90427: in VariableManager get_vars() 11762 1726853253.90432: Calling all_inventory to load vars for managed_node2 11762 1726853253.90433: Calling groups_inventory to load vars for managed_node2 11762 1726853253.90435: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.90439: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.90440: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.90442: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.90542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.90651: done with get_vars() 11762 1726853253.90658: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:27:33 -0400 (0:00:00.024) 0:00:04.337 ****** 11762 1726853253.90709: entering _queue_task() for managed_node2/include_tasks 11762 1726853253.90897: worker is 1 (out of 1 available) 11762 1726853253.90908: exiting _queue_task() for managed_node2/include_tasks 11762 1726853253.90919: done queuing things up, now waiting for results queue to drain 11762 1726853253.90923: waiting for pending results... 
11762 1726853253.91188: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 11762 1726853253.91278: in run() - task 02083763-bbaf-d845-03d0-0000000000b1 11762 1726853253.91281: variable 'ansible_search_path' from source: unknown 11762 1726853253.91284: variable 'ansible_search_path' from source: unknown 11762 1726853253.91295: calling self._execute() 11762 1726853253.91375: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.91386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.91407: variable 'omit' from source: magic vars 11762 1726853253.91794: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.91840: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.91849: _execute() done 11762 1726853253.91852: dumping result to json 11762 1726853253.91855: done dumping result, returning 11762 1726853253.91857: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-d845-03d0-0000000000b1] 11762 1726853253.91859: sending task result for task 02083763-bbaf-d845-03d0-0000000000b1 11762 1726853253.91968: done sending task result for task 02083763-bbaf-d845-03d0-0000000000b1 11762 1726853253.91973: WORKER PROCESS EXITING 11762 1726853253.92001: no more pending results, returning what we have 11762 1726853253.92006: in VariableManager get_vars() 11762 1726853253.92037: Calling all_inventory to load vars for managed_node2 11762 1726853253.92040: Calling groups_inventory to load vars for managed_node2 11762 1726853253.92043: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.92057: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.92059: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.92062: Calling groups_plugins_play to load vars for managed_node2 11762 
1726853253.92196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.92309: done with get_vars() 11762 1726853253.92314: variable 'ansible_search_path' from source: unknown 11762 1726853253.92315: variable 'ansible_search_path' from source: unknown 11762 1726853253.92337: we have included files to process 11762 1726853253.92338: generating all_blocks data 11762 1726853253.92339: done generating all_blocks data 11762 1726853253.92340: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11762 1726853253.92341: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11762 1726853253.92342: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11762 1726853253.92550: done processing included file 11762 1726853253.92551: iterating over new_blocks loaded from include file 11762 1726853253.92552: in VariableManager get_vars() 11762 1726853253.92562: done with get_vars() 11762 1726853253.92564: filtering new block on tags 11762 1726853253.92588: done filtering new block on tags 11762 1726853253.92590: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 11762 1726853253.92593: extending task lists for all hosts with included blocks 11762 1726853253.92684: done extending task lists 11762 1726853253.92685: done processing included files 11762 1726853253.92686: results queue empty 11762 1726853253.92686: checking for any_errors_fatal 11762 1726853253.92688: done checking for any_errors_fatal 11762 1726853253.92689: checking for max_fail_percentage 11762 1726853253.92689: done 
checking for max_fail_percentage 11762 1726853253.92690: checking to see if all hosts have failed and the running result is not ok 11762 1726853253.92690: done checking to see if all hosts have failed 11762 1726853253.92691: getting the remaining hosts for this loop 11762 1726853253.92692: done getting the remaining hosts for this loop 11762 1726853253.92693: getting the next task for host managed_node2 11762 1726853253.92696: done getting next task for host managed_node2 11762 1726853253.92697: ^ task is: TASK: Gather current interface info 11762 1726853253.92699: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853253.92701: getting variables 11762 1726853253.92701: in VariableManager get_vars() 11762 1726853253.92706: Calling all_inventory to load vars for managed_node2 11762 1726853253.92708: Calling groups_inventory to load vars for managed_node2 11762 1726853253.92709: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853253.92713: Calling all_plugins_play to load vars for managed_node2 11762 1726853253.92714: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853253.92716: Calling groups_plugins_play to load vars for managed_node2 11762 1726853253.92820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853253.92927: done with get_vars() 11762 1726853253.92933: done getting variables 11762 1726853253.92961: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:27:33 -0400 (0:00:00.022) 0:00:04.360 ****** 11762 1726853253.92983: entering _queue_task() for managed_node2/command 11762 1726853253.93187: worker is 1 (out of 1 available) 11762 1726853253.93200: exiting _queue_task() for managed_node2/command 11762 1726853253.93210: done queuing things up, now waiting for results queue to drain 11762 1726853253.93212: waiting for pending results... 
11762 1726853253.93344: running TaskExecutor() for managed_node2/TASK: Gather current interface info 11762 1726853253.93414: in run() - task 02083763-bbaf-d845-03d0-0000000000ec 11762 1726853253.93426: variable 'ansible_search_path' from source: unknown 11762 1726853253.93430: variable 'ansible_search_path' from source: unknown 11762 1726853253.93463: calling self._execute() 11762 1726853253.93517: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.93520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.93529: variable 'omit' from source: magic vars 11762 1726853253.93792: variable 'ansible_distribution_major_version' from source: facts 11762 1726853253.93801: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853253.93807: variable 'omit' from source: magic vars 11762 1726853253.93838: variable 'omit' from source: magic vars 11762 1726853253.93863: variable 'omit' from source: magic vars 11762 1726853253.93897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853253.93924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853253.93939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853253.93954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.93964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853253.93991: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853253.93995: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.93997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 
1726853253.94063: Set connection var ansible_timeout to 10 11762 1726853253.94066: Set connection var ansible_shell_type to sh 11762 1726853253.94069: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853253.94076: Set connection var ansible_shell_executable to /bin/sh 11762 1726853253.94084: Set connection var ansible_pipelining to False 11762 1726853253.94094: Set connection var ansible_connection to ssh 11762 1726853253.94108: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.94111: variable 'ansible_connection' from source: unknown 11762 1726853253.94114: variable 'ansible_module_compression' from source: unknown 11762 1726853253.94116: variable 'ansible_shell_type' from source: unknown 11762 1726853253.94119: variable 'ansible_shell_executable' from source: unknown 11762 1726853253.94121: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853253.94123: variable 'ansible_pipelining' from source: unknown 11762 1726853253.94126: variable 'ansible_timeout' from source: unknown 11762 1726853253.94130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853253.94231: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853253.94239: variable 'omit' from source: magic vars 11762 1726853253.94244: starting attempt loop 11762 1726853253.94250: running the handler 11762 1726853253.94263: _low_level_execute_command(): starting 11762 1726853253.94272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853253.94758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11762 1726853253.94794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853253.94798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853253.94801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853253.94851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853253.94854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853253.94857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853253.94937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853253.96651: stdout chunk (state=3): >>>/root <<< 11762 1726853253.96751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853253.96782: stderr chunk (state=3): >>><<< 11762 1726853253.96785: stdout chunk (state=3): >>><<< 11762 1726853253.96806: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853253.96821: _low_level_execute_command(): starting 11762 1726853253.96828: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763 `" && echo ansible-tmp-1726853253.9680622-12005-269318836475763="` echo /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763 `" ) && sleep 0' 11762 1726853253.97285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853253.97288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853253.97299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853253.97301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853253.97347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853253.97352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853253.97355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853253.97423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853253.99431: stdout chunk (state=3): >>>ansible-tmp-1726853253.9680622-12005-269318836475763=/root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763 <<< 11762 1726853253.99534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853253.99567: stderr chunk (state=3): >>><<< 11762 1726853253.99570: stdout chunk (state=3): >>><<< 11762 1726853253.99587: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853253.9680622-12005-269318836475763=/root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853253.99613: variable 'ansible_module_compression' from source: unknown 11762 1726853253.99654: ANSIBALLZ: Using generic lock for ansible.legacy.command 11762 1726853253.99657: ANSIBALLZ: Acquiring lock 11762 1726853253.99660: ANSIBALLZ: Lock acquired: 139956166284816 11762 1726853253.99662: ANSIBALLZ: Creating module 11762 1726853254.11933: ANSIBALLZ: Writing module into payload 11762 1726853254.12077: ANSIBALLZ: Writing module 11762 1726853254.12081: ANSIBALLZ: Renaming module 11762 1726853254.12083: ANSIBALLZ: Done creating module 11762 1726853254.12084: variable 'ansible_facts' from source: unknown 11762 1726853254.12141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/AnsiballZ_command.py 11762 1726853254.12448: Sending initial data 11762 1726853254.12452: Sent initial data (156 bytes) 11762 1726853254.13031: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853254.13056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.13167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.14841: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853254.14908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853254.14978: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpm_yfbymy /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/AnsiballZ_command.py <<< 11762 1726853254.14984: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/AnsiballZ_command.py" <<< 11762 1726853254.15047: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpm_yfbymy" to remote "/root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/AnsiballZ_command.py" <<< 11762 1726853254.15053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/AnsiballZ_command.py" <<< 11762 1726853254.15960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853254.15964: stdout chunk (state=3): >>><<< 11762 1726853254.15966: stderr chunk (state=3): >>><<< 11762 1726853254.16049: done transferring module to remote 11762 1726853254.16077: _low_level_execute_command(): starting 11762 1726853254.16081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/ /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/AnsiballZ_command.py && sleep 0' 11762 1726853254.16550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853254.16563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853254.16585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.16633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853254.16637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.16718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.18645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853254.18650: stdout chunk (state=3): >>><<< 11762 1726853254.18652: stderr chunk (state=3): >>><<< 11762 1726853254.18669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853254.18751: _low_level_execute_command(): starting 11762 1726853254.18754: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/AnsiballZ_command.py && sleep 0' 11762 1726853254.19307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853254.19310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.19320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853254.19322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.19373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853254.19377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 11762 1726853254.19384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.19468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.35400: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:27:34.349414", "end": "2024-09-20 13:27:34.352907", "delta": "0:00:00.003493", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853254.37016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853254.37045: stderr chunk (state=3): >>><<< 11762 1726853254.37049: stdout chunk (state=3): >>><<< 11762 1726853254.37065: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:27:34.349414", "end": "2024-09-20 13:27:34.352907", "delta": "0:00:00.003493", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853254.37097: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853254.37108: _low_level_execute_command(): starting 11762 1726853254.37111: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853253.9680622-12005-269318836475763/ > /dev/null 2>&1 && sleep 0' 11762 1726853254.37567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853254.37573: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.37581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853254.37583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.37630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853254.37638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853254.37640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.37705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.39594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853254.39621: stderr chunk (state=3): >>><<< 11762 1726853254.39624: stdout chunk (state=3): >>><<< 11762 1726853254.39637: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853254.39642: handler run complete 11762 1726853254.39661: Evaluated conditional (False): False 11762 1726853254.39675: attempt loop complete, returning result 11762 1726853254.39678: _execute() done 11762 1726853254.39681: dumping result to json 11762 1726853254.39683: done dumping result, returning 11762 1726853254.39688: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-d845-03d0-0000000000ec] 11762 1726853254.39693: sending task result for task 02083763-bbaf-d845-03d0-0000000000ec 11762 1726853254.39791: done sending task result for task 02083763-bbaf-d845-03d0-0000000000ec 11762 1726853254.39794: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003493", "end": "2024-09-20 13:27:34.352907", "rc": 0, "start": "2024-09-20 13:27:34.349414" } STDOUT: eth0 lo 11762 1726853254.39881: no more pending results, returning what we have 11762 1726853254.39884: results queue empty 11762 1726853254.39885: checking for any_errors_fatal 11762 1726853254.39886: done checking for any_errors_fatal 11762 1726853254.39887: checking for 
max_fail_percentage 11762 1726853254.39888: done checking for max_fail_percentage 11762 1726853254.39889: checking to see if all hosts have failed and the running result is not ok 11762 1726853254.39890: done checking to see if all hosts have failed 11762 1726853254.39890: getting the remaining hosts for this loop 11762 1726853254.39892: done getting the remaining hosts for this loop 11762 1726853254.39895: getting the next task for host managed_node2 11762 1726853254.39902: done getting next task for host managed_node2 11762 1726853254.39905: ^ task is: TASK: Set current_interfaces 11762 1726853254.39909: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853254.39913: getting variables 11762 1726853254.39914: in VariableManager get_vars() 11762 1726853254.39946: Calling all_inventory to load vars for managed_node2 11762 1726853254.39949: Calling groups_inventory to load vars for managed_node2 11762 1726853254.39952: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853254.39962: Calling all_plugins_play to load vars for managed_node2 11762 1726853254.39964: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853254.39967: Calling groups_plugins_play to load vars for managed_node2 11762 1726853254.40121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853254.40260: done with get_vars() 11762 1726853254.40267: done getting variables 11762 1726853254.40311: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:27:34 -0400 (0:00:00.473) 0:00:04.833 ****** 11762 1726853254.40332: entering _queue_task() for managed_node2/set_fact 11762 1726853254.40530: worker is 1 (out of 1 available) 11762 1726853254.40546: exiting _queue_task() for managed_node2/set_fact 11762 1726853254.40557: done queuing things up, now waiting for results queue to drain 11762 1726853254.40559: waiting for pending results... 
11762 1726853254.40702: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 11762 1726853254.40761: in run() - task 02083763-bbaf-d845-03d0-0000000000ed 11762 1726853254.40774: variable 'ansible_search_path' from source: unknown 11762 1726853254.40778: variable 'ansible_search_path' from source: unknown 11762 1726853254.40807: calling self._execute() 11762 1726853254.40862: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.40865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.40876: variable 'omit' from source: magic vars 11762 1726853254.41138: variable 'ansible_distribution_major_version' from source: facts 11762 1726853254.41149: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853254.41154: variable 'omit' from source: magic vars 11762 1726853254.41188: variable 'omit' from source: magic vars 11762 1726853254.41260: variable '_current_interfaces' from source: set_fact 11762 1726853254.41307: variable 'omit' from source: magic vars 11762 1726853254.41338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853254.41366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853254.41383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853254.41396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853254.41406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853254.41428: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853254.41431: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.41436: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.41502: Set connection var ansible_timeout to 10 11762 1726853254.41505: Set connection var ansible_shell_type to sh 11762 1726853254.41510: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853254.41515: Set connection var ansible_shell_executable to /bin/sh 11762 1726853254.41522: Set connection var ansible_pipelining to False 11762 1726853254.41528: Set connection var ansible_connection to ssh 11762 1726853254.41543: variable 'ansible_shell_executable' from source: unknown 11762 1726853254.41550: variable 'ansible_connection' from source: unknown 11762 1726853254.41553: variable 'ansible_module_compression' from source: unknown 11762 1726853254.41555: variable 'ansible_shell_type' from source: unknown 11762 1726853254.41557: variable 'ansible_shell_executable' from source: unknown 11762 1726853254.41559: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.41562: variable 'ansible_pipelining' from source: unknown 11762 1726853254.41564: variable 'ansible_timeout' from source: unknown 11762 1726853254.41565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.41662: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853254.41672: variable 'omit' from source: magic vars 11762 1726853254.41677: starting attempt loop 11762 1726853254.41686: running the handler 11762 1726853254.41695: handler run complete 11762 1726853254.41702: attempt loop complete, returning result 11762 1726853254.41705: _execute() done 11762 1726853254.41708: dumping result to json 11762 1726853254.41710: done dumping result, returning 11762 
1726853254.41717: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-d845-03d0-0000000000ed] 11762 1726853254.41722: sending task result for task 02083763-bbaf-d845-03d0-0000000000ed 11762 1726853254.41802: done sending task result for task 02083763-bbaf-d845-03d0-0000000000ed 11762 1726853254.41805: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 11762 1726853254.41858: no more pending results, returning what we have 11762 1726853254.41861: results queue empty 11762 1726853254.41862: checking for any_errors_fatal 11762 1726853254.41872: done checking for any_errors_fatal 11762 1726853254.41873: checking for max_fail_percentage 11762 1726853254.41875: done checking for max_fail_percentage 11762 1726853254.41876: checking to see if all hosts have failed and the running result is not ok 11762 1726853254.41877: done checking to see if all hosts have failed 11762 1726853254.41878: getting the remaining hosts for this loop 11762 1726853254.41880: done getting the remaining hosts for this loop 11762 1726853254.41883: getting the next task for host managed_node2 11762 1726853254.41890: done getting next task for host managed_node2 11762 1726853254.41893: ^ task is: TASK: Show current_interfaces 11762 1726853254.41896: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853254.41901: getting variables 11762 1726853254.41902: in VariableManager get_vars() 11762 1726853254.41929: Calling all_inventory to load vars for managed_node2 11762 1726853254.41931: Calling groups_inventory to load vars for managed_node2 11762 1726853254.41934: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853254.41942: Calling all_plugins_play to load vars for managed_node2 11762 1726853254.41944: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853254.41947: Calling groups_plugins_play to load vars for managed_node2 11762 1726853254.42087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853254.42202: done with get_vars() 11762 1726853254.42209: done getting variables 11762 1726853254.42249: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:27:34 -0400 (0:00:00.019) 0:00:04.853 ****** 11762 1726853254.42269: entering _queue_task() for managed_node2/debug 11762 1726853254.42458: worker is 1 (out of 1 available) 11762 1726853254.42474: exiting _queue_task() for managed_node2/debug 11762 1726853254.42487: done queuing things up, now waiting for results queue to drain 11762 1726853254.42489: waiting for pending results... 
11762 1726853254.42627: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 11762 1726853254.42686: in run() - task 02083763-bbaf-d845-03d0-0000000000b2 11762 1726853254.42697: variable 'ansible_search_path' from source: unknown 11762 1726853254.42700: variable 'ansible_search_path' from source: unknown 11762 1726853254.42729: calling self._execute() 11762 1726853254.42788: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.42793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.42800: variable 'omit' from source: magic vars 11762 1726853254.43056: variable 'ansible_distribution_major_version' from source: facts 11762 1726853254.43068: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853254.43076: variable 'omit' from source: magic vars 11762 1726853254.43105: variable 'omit' from source: magic vars 11762 1726853254.43174: variable 'current_interfaces' from source: set_fact 11762 1726853254.43194: variable 'omit' from source: magic vars 11762 1726853254.43223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853254.43254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853254.43269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853254.43375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853254.43378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853254.43380: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853254.43382: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.43384: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.43447: Set connection var ansible_timeout to 10 11762 1726853254.43456: Set connection var ansible_shell_type to sh 11762 1726853254.43466: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853254.43479: Set connection var ansible_shell_executable to /bin/sh 11762 1726853254.43493: Set connection var ansible_pipelining to False 11762 1726853254.43503: Set connection var ansible_connection to ssh 11762 1726853254.43527: variable 'ansible_shell_executable' from source: unknown 11762 1726853254.43535: variable 'ansible_connection' from source: unknown 11762 1726853254.43543: variable 'ansible_module_compression' from source: unknown 11762 1726853254.43553: variable 'ansible_shell_type' from source: unknown 11762 1726853254.43560: variable 'ansible_shell_executable' from source: unknown 11762 1726853254.43567: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.43577: variable 'ansible_pipelining' from source: unknown 11762 1726853254.43583: variable 'ansible_timeout' from source: unknown 11762 1726853254.43591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.43735: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853254.43756: variable 'omit' from source: magic vars 11762 1726853254.43875: starting attempt loop 11762 1726853254.43879: running the handler 11762 1726853254.43881: handler run complete 11762 1726853254.43883: attempt loop complete, returning result 11762 1726853254.43884: _execute() done 11762 1726853254.43886: dumping result to json 11762 1726853254.43888: done dumping result, returning 11762 1726853254.43890: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-d845-03d0-0000000000b2] 11762 1726853254.43892: sending task result for task 02083763-bbaf-d845-03d0-0000000000b2 ok: [managed_node2] => {} MSG: current_interfaces: ['eth0', 'lo'] 11762 1726853254.44002: no more pending results, returning what we have 11762 1726853254.44005: results queue empty 11762 1726853254.44006: checking for any_errors_fatal 11762 1726853254.44009: done checking for any_errors_fatal 11762 1726853254.44010: checking for max_fail_percentage 11762 1726853254.44012: done checking for max_fail_percentage 11762 1726853254.44013: checking to see if all hosts have failed and the running result is not ok 11762 1726853254.44014: done checking to see if all hosts have failed 11762 1726853254.44014: getting the remaining hosts for this loop 11762 1726853254.44017: done getting the remaining hosts for this loop 11762 1726853254.44020: getting the next task for host managed_node2 11762 1726853254.44027: done getting next task for host managed_node2 11762 1726853254.44030: ^ task is: TASK: Setup 11762 1726853254.44033: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853254.44037: getting variables 11762 1726853254.44038: in VariableManager get_vars() 11762 1726853254.44070: Calling all_inventory to load vars for managed_node2 11762 1726853254.44074: Calling groups_inventory to load vars for managed_node2 11762 1726853254.44078: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853254.44089: Calling all_plugins_play to load vars for managed_node2 11762 1726853254.44092: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853254.44095: Calling groups_plugins_play to load vars for managed_node2 11762 1726853254.44524: done sending task result for task 02083763-bbaf-d845-03d0-0000000000b2 11762 1726853254.44527: WORKER PROCESS EXITING 11762 1726853254.44548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853254.44670: done with get_vars() 11762 1726853254.44678: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:27:34 -0400 (0:00:00.024) 0:00:04.877 ****** 11762 1726853254.44738: entering _queue_task() for managed_node2/include_tasks 11762 1726853254.44930: worker is 1 (out of 1 available) 11762 1726853254.44946: exiting _queue_task() for managed_node2/include_tasks 11762 1726853254.44958: done queuing things up, now waiting for results queue to drain 11762 1726853254.44960: waiting for pending results... 
11762 1726853254.45100: running TaskExecutor() for managed_node2/TASK: Setup 11762 1726853254.45158: in run() - task 02083763-bbaf-d845-03d0-00000000008b 11762 1726853254.45172: variable 'ansible_search_path' from source: unknown 11762 1726853254.45175: variable 'ansible_search_path' from source: unknown 11762 1726853254.45212: variable 'lsr_setup' from source: include params 11762 1726853254.45362: variable 'lsr_setup' from source: include params 11762 1726853254.45418: variable 'omit' from source: magic vars 11762 1726853254.45499: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.45508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.45518: variable 'omit' from source: magic vars 11762 1726853254.45673: variable 'ansible_distribution_major_version' from source: facts 11762 1726853254.45680: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853254.45686: variable 'item' from source: unknown 11762 1726853254.45737: variable 'item' from source: unknown 11762 1726853254.45758: variable 'item' from source: unknown 11762 1726853254.45801: variable 'item' from source: unknown 11762 1726853254.45915: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.45918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.45920: variable 'omit' from source: magic vars 11762 1726853254.45998: variable 'ansible_distribution_major_version' from source: facts 11762 1726853254.46001: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853254.46007: variable 'item' from source: unknown 11762 1726853254.46052: variable 'item' from source: unknown 11762 1726853254.46073: variable 'item' from source: unknown 11762 1726853254.46114: variable 'item' from source: unknown 11762 1726853254.46184: dumping result to json 11762 1726853254.46187: done dumping result, returning 11762 
1726853254.46189: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-d845-03d0-00000000008b] 11762 1726853254.46191: sending task result for task 02083763-bbaf-d845-03d0-00000000008b 11762 1726853254.46223: done sending task result for task 02083763-bbaf-d845-03d0-00000000008b 11762 1726853254.46225: WORKER PROCESS EXITING 11762 1726853254.46311: no more pending results, returning what we have 11762 1726853254.46315: in VariableManager get_vars() 11762 1726853254.46348: Calling all_inventory to load vars for managed_node2 11762 1726853254.46351: Calling groups_inventory to load vars for managed_node2 11762 1726853254.46354: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853254.46366: Calling all_plugins_play to load vars for managed_node2 11762 1726853254.46369: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853254.46376: Calling groups_plugins_play to load vars for managed_node2 11762 1726853254.46672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853254.47065: done with get_vars() 11762 1726853254.47074: variable 'ansible_search_path' from source: unknown 11762 1726853254.47075: variable 'ansible_search_path' from source: unknown 11762 1726853254.47118: variable 'ansible_search_path' from source: unknown 11762 1726853254.47119: variable 'ansible_search_path' from source: unknown 11762 1726853254.47148: we have included files to process 11762 1726853254.47149: generating all_blocks data 11762 1726853254.47151: done generating all_blocks data 11762 1726853254.47154: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11762 1726853254.47156: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11762 1726853254.47158: 
Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11762 1726853254.48275: done processing included file 11762 1726853254.48277: iterating over new_blocks loaded from include file 11762 1726853254.48279: in VariableManager get_vars() 11762 1726853254.48296: done with get_vars() 11762 1726853254.48298: filtering new block on tags 11762 1726853254.48348: done filtering new block on tags 11762 1726853254.48351: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/create_test_interfaces_with_dhcp.yml) 11762 1726853254.48355: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11762 1726853254.48356: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11762 1726853254.48359: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11762 1726853254.48489: in VariableManager get_vars() 11762 1726853254.48509: done with get_vars() 11762 1726853254.48516: variable 'item' from source: include params 11762 1726853254.48622: variable 'item' from source: include params 11762 1726853254.48659: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11762 1726853254.48774: in VariableManager get_vars() 11762 1726853254.48792: done with get_vars() 11762 1726853254.48933: in VariableManager 
get_vars() 11762 1726853254.48957: done with get_vars() 11762 1726853254.48963: variable 'item' from source: include params 11762 1726853254.49023: variable 'item' from source: include params 11762 1726853254.49059: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11762 1726853254.49133: in VariableManager get_vars() 11762 1726853254.49154: done with get_vars() 11762 1726853254.49255: done processing included file 11762 1726853254.49257: iterating over new_blocks loaded from include file 11762 1726853254.49258: in VariableManager get_vars() 11762 1726853254.49276: done with get_vars() 11762 1726853254.49278: filtering new block on tags 11762 1726853254.49378: done filtering new block on tags 11762 1726853254.49382: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node2 => (item=tasks/assert_dhcp_device_present.yml) 11762 1726853254.49386: extending task lists for all hosts with included blocks 11762 1726853254.50006: done extending task lists 11762 1726853254.50007: done processing included files 11762 1726853254.50008: results queue empty 11762 1726853254.50009: checking for any_errors_fatal 11762 1726853254.50012: done checking for any_errors_fatal 11762 1726853254.50013: checking for max_fail_percentage 11762 1726853254.50014: done checking for max_fail_percentage 11762 1726853254.50014: checking to see if all hosts have failed and the running result is not ok 11762 1726853254.50015: done checking to see if all hosts have failed 11762 1726853254.50019: getting the remaining hosts for this loop 11762 1726853254.50020: done getting the remaining hosts for this loop 11762 
1726853254.50027: getting the next task for host managed_node2 11762 1726853254.50030: done getting next task for host managed_node2 11762 1726853254.50032: ^ task is: TASK: Install dnsmasq 11762 1726853254.50034: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853254.50037: getting variables 11762 1726853254.50038: in VariableManager get_vars() 11762 1726853254.50047: Calling all_inventory to load vars for managed_node2 11762 1726853254.50049: Calling groups_inventory to load vars for managed_node2 11762 1726853254.50051: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853254.50056: Calling all_plugins_play to load vars for managed_node2 11762 1726853254.50058: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853254.50060: Calling groups_plugins_play to load vars for managed_node2 11762 1726853254.50188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853254.50396: done with get_vars() 11762 1726853254.50405: done getting variables 11762 1726853254.50448: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:27:34 -0400 (0:00:00.057) 0:00:04.935 ****** 11762 1726853254.50484: entering _queue_task() for managed_node2/package 11762 1726853254.50920: worker is 1 (out of 1 available) 11762 1726853254.50933: exiting _queue_task() for managed_node2/package 11762 1726853254.50947: done queuing things up, now waiting for results queue to drain 11762 1726853254.50949: waiting for pending results... 
11762 1726853254.51135: running TaskExecutor() for managed_node2/TASK: Install dnsmasq 11762 1726853254.51285: in run() - task 02083763-bbaf-d845-03d0-000000000112 11762 1726853254.51289: variable 'ansible_search_path' from source: unknown 11762 1726853254.51292: variable 'ansible_search_path' from source: unknown 11762 1726853254.51321: calling self._execute() 11762 1726853254.51558: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.51562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.51565: variable 'omit' from source: magic vars 11762 1726853254.51865: variable 'ansible_distribution_major_version' from source: facts 11762 1726853254.51894: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853254.51906: variable 'omit' from source: magic vars 11762 1726853254.51953: variable 'omit' from source: magic vars 11762 1726853254.52168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853254.54342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853254.54434: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853254.54480: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853254.54524: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853254.54556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853254.54711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853254.54715: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853254.54729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853254.54776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853254.54794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853254.54911: variable '__network_is_ostree' from source: set_fact 11762 1726853254.54936: variable 'omit' from source: magic vars 11762 1726853254.54967: variable 'omit' from source: magic vars 11762 1726853254.55075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853254.55079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853254.55081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853254.55084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853254.55086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853254.55118: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853254.55125: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.55132: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11762 1726853254.55239: Set connection var ansible_timeout to 10 11762 1726853254.55275: Set connection var ansible_shell_type to sh 11762 1726853254.55278: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853254.55280: Set connection var ansible_shell_executable to /bin/sh 11762 1726853254.55282: Set connection var ansible_pipelining to False 11762 1726853254.55286: Set connection var ansible_connection to ssh 11762 1726853254.55318: variable 'ansible_shell_executable' from source: unknown 11762 1726853254.55326: variable 'ansible_connection' from source: unknown 11762 1726853254.55332: variable 'ansible_module_compression' from source: unknown 11762 1726853254.55418: variable 'ansible_shell_type' from source: unknown 11762 1726853254.55422: variable 'ansible_shell_executable' from source: unknown 11762 1726853254.55424: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853254.55426: variable 'ansible_pipelining' from source: unknown 11762 1726853254.55428: variable 'ansible_timeout' from source: unknown 11762 1726853254.55431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853254.55479: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853254.55494: variable 'omit' from source: magic vars 11762 1726853254.55503: starting attempt loop 11762 1726853254.55509: running the handler 11762 1726853254.55523: variable 'ansible_facts' from source: unknown 11762 1726853254.55530: variable 'ansible_facts' from source: unknown 11762 1726853254.55589: _low_level_execute_command(): starting 11762 1726853254.55635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853254.56320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.56324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853254.56331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.56475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.58179: stdout chunk (state=3): >>>/root <<< 11762 1726853254.58326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853254.58341: stderr chunk (state=3): >>><<< 11762 1726853254.58356: stdout chunk (state=3): >>><<< 11762 1726853254.58397: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853254.58507: _low_level_execute_command(): starting 11762 1726853254.58510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190 `" && echo ansible-tmp-1726853254.5841324-12027-42474683391190="` echo /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190 `" ) && sleep 0' 11762 1726853254.59080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853254.59097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853254.59137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.59149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853254.59242: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853254.59265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.59375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.61414: stdout chunk (state=3): >>>ansible-tmp-1726853254.5841324-12027-42474683391190=/root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190 <<< 11762 1726853254.61561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853254.61583: stderr chunk (state=3): >>><<< 11762 1726853254.61597: stdout chunk (state=3): >>><<< 11762 1726853254.61655: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853254.5841324-12027-42474683391190=/root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853254.61660: variable 'ansible_module_compression' from source: unknown 11762 1726853254.61728: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11762 1726853254.61736: ANSIBALLZ: Acquiring lock 11762 1726853254.61742: ANSIBALLZ: Lock acquired: 139956166284816 11762 1726853254.61753: ANSIBALLZ: Creating module 11762 1726853254.80451: ANSIBALLZ: Writing module into payload 11762 1726853254.80608: ANSIBALLZ: Writing module 11762 1726853254.80635: ANSIBALLZ: Renaming module 11762 1726853254.80657: ANSIBALLZ: Done creating module 11762 1726853254.80690: variable 'ansible_facts' from source: unknown 11762 1726853254.80877: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/AnsiballZ_dnf.py 11762 1726853254.81022: Sending initial data 11762 1726853254.81025: Sent initial data (151 bytes) 11762 1726853254.81750: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853254.81781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.81904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.83608: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853254.83694: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853254.83806: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpgc4jd13j /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/AnsiballZ_dnf.py <<< 11762 1726853254.83809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/AnsiballZ_dnf.py" <<< 11762 1726853254.83862: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpgc4jd13j" to remote "/root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/AnsiballZ_dnf.py" <<< 11762 1726853254.85196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853254.85387: stderr chunk (state=3): >>><<< 11762 1726853254.85391: stdout chunk (state=3): >>><<< 11762 1726853254.85393: done transferring module to remote 11762 1726853254.85394: _low_level_execute_command(): starting 11762 1726853254.85396: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/ /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/AnsiballZ_dnf.py && sleep 0' 11762 1726853254.86151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853254.86155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853254.86158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.86160: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853254.86163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.86223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853254.86227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853254.86274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.86349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853254.88478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853254.88486: stdout chunk (state=3): >>><<< 11762 1726853254.88489: stderr chunk (state=3): >>><<< 11762 1726853254.88491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853254.88493: _low_level_execute_command(): starting 11762 1726853254.88496: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/AnsiballZ_dnf.py && sleep 0' 11762 1726853254.89142: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853254.89145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853254.89148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853254.89150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853254.89152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853254.89154: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853254.89187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853254.89198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853254.89211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853254.89236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853254.89343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853256.72542: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11762 1726853256.80119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853256.80140: stderr chunk (state=3): >>><<< 11762 1726853256.80146: stdout chunk (state=3): >>><<< 11762 1726853256.80163: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853256.80200: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853256.80206: _low_level_execute_command(): starting 11762 1726853256.80211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853254.5841324-12027-42474683391190/ > /dev/null 2>&1 && sleep 0' 11762 1726853256.80646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853256.80650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853256.80654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 11762 1726853256.80656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853256.80658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853256.80705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853256.80709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853256.80713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853256.80802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853256.82773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853256.82796: stderr chunk (state=3): >>><<< 11762 1726853256.82799: stdout chunk (state=3): >>><<< 11762 1726853256.82813: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853256.82819: handler run complete 11762 1726853256.82932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853256.83056: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853256.83087: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853256.83110: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853256.83131: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853256.83189: variable '__install_status' from source: unknown 11762 1726853256.83200: Evaluated conditional (__install_status is success): True 11762 1726853256.83212: attempt loop complete, returning result 11762 1726853256.83215: _execute() done 11762 1726853256.83217: dumping result to json 11762 1726853256.83222: done dumping result, returning 11762 1726853256.83229: done running TaskExecutor() for managed_node2/TASK: Install dnsmasq [02083763-bbaf-d845-03d0-000000000112] 11762 1726853256.83234: sending task result for task 02083763-bbaf-d845-03d0-000000000112 11762 1726853256.83369: done sending task result for task 02083763-bbaf-d845-03d0-000000000112 11762 1726853256.83374: WORKER PROCESS EXITING changed: [managed_node2] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-3.el10.x86_64" ] } 11762 1726853256.83455: no more pending results, returning what we have 11762 1726853256.83458: results queue empty 11762 1726853256.83459: 
checking for any_errors_fatal 11762 1726853256.83460: done checking for any_errors_fatal 11762 1726853256.83461: checking for max_fail_percentage 11762 1726853256.83462: done checking for max_fail_percentage 11762 1726853256.83463: checking to see if all hosts have failed and the running result is not ok 11762 1726853256.83464: done checking to see if all hosts have failed 11762 1726853256.83464: getting the remaining hosts for this loop 11762 1726853256.83466: done getting the remaining hosts for this loop 11762 1726853256.83469: getting the next task for host managed_node2 11762 1726853256.83476: done getting next task for host managed_node2 11762 1726853256.83478: ^ task is: TASK: Install pgrep, sysctl 11762 1726853256.83481: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853256.83485: getting variables 11762 1726853256.83487: in VariableManager get_vars() 11762 1726853256.83512: Calling all_inventory to load vars for managed_node2 11762 1726853256.83514: Calling groups_inventory to load vars for managed_node2 11762 1726853256.83517: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853256.83526: Calling all_plugins_play to load vars for managed_node2 11762 1726853256.83529: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853256.83532: Calling groups_plugins_play to load vars for managed_node2 11762 1726853256.83687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853256.83901: done with get_vars() 11762 1726853256.83911: done getting variables 11762 1726853256.83973: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 13:27:36 -0400 (0:00:02.335) 0:00:07.270 ****** 11762 1726853256.84004: entering _queue_task() for managed_node2/package 11762 1726853256.84497: worker is 1 (out of 1 available) 11762 1726853256.84505: exiting _queue_task() for managed_node2/package 11762 1726853256.84514: done queuing things up, now waiting for results queue to drain 11762 1726853256.84516: waiting for pending results... 
11762 1726853256.84537: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11762 1726853256.84647: in run() - task 02083763-bbaf-d845-03d0-000000000113 11762 1726853256.84667: variable 'ansible_search_path' from source: unknown 11762 1726853256.84683: variable 'ansible_search_path' from source: unknown 11762 1726853256.84710: calling self._execute() 11762 1726853256.84769: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853256.84776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853256.84784: variable 'omit' from source: magic vars 11762 1726853256.85049: variable 'ansible_distribution_major_version' from source: facts 11762 1726853256.85059: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853256.85134: variable 'ansible_os_family' from source: facts 11762 1726853256.85138: Evaluated conditional (ansible_os_family == 'RedHat'): True 11762 1726853256.85256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853256.85473: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853256.85506: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853256.85530: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853256.85555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853256.85611: variable 'ansible_distribution_major_version' from source: facts 11762 1726853256.85622: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11762 1726853256.85625: when evaluation is False, skipping this task 11762 1726853256.85627: _execute() done 11762 1726853256.85630: dumping result to json 11762 1726853256.85632: done dumping result, 
returning 11762 1726853256.85638: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [02083763-bbaf-d845-03d0-000000000113] 11762 1726853256.85643: sending task result for task 02083763-bbaf-d845-03d0-000000000113 11762 1726853256.85724: done sending task result for task 02083763-bbaf-d845-03d0-000000000113 11762 1726853256.85727: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11762 1726853256.85773: no more pending results, returning what we have 11762 1726853256.85777: results queue empty 11762 1726853256.85777: checking for any_errors_fatal 11762 1726853256.85784: done checking for any_errors_fatal 11762 1726853256.85785: checking for max_fail_percentage 11762 1726853256.85787: done checking for max_fail_percentage 11762 1726853256.85788: checking to see if all hosts have failed and the running result is not ok 11762 1726853256.85789: done checking to see if all hosts have failed 11762 1726853256.85789: getting the remaining hosts for this loop 11762 1726853256.85791: done getting the remaining hosts for this loop 11762 1726853256.85794: getting the next task for host managed_node2 11762 1726853256.85800: done getting next task for host managed_node2 11762 1726853256.85802: ^ task is: TASK: Install pgrep, sysctl 11762 1726853256.85805: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853256.85807: getting variables 11762 1726853256.85808: in VariableManager get_vars() 11762 1726853256.85833: Calling all_inventory to load vars for managed_node2 11762 1726853256.85837: Calling groups_inventory to load vars for managed_node2 11762 1726853256.85839: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853256.85849: Calling all_plugins_play to load vars for managed_node2 11762 1726853256.85852: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853256.85854: Calling groups_plugins_play to load vars for managed_node2 11762 1726853256.85979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853256.86119: done with get_vars() 11762 1726853256.86126: done getting variables 11762 1726853256.86166: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 13:27:36 -0400 (0:00:00.021) 0:00:07.292 ****** 11762 1726853256.86189: entering _queue_task() for managed_node2/package 11762 1726853256.86374: worker is 1 (out of 1 available) 11762 1726853256.86387: exiting _queue_task() for managed_node2/package 11762 1726853256.86398: done queuing 
things up, now waiting for results queue to drain 11762 1726853256.86400: waiting for pending results... 11762 1726853256.86536: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11762 1726853256.86594: in run() - task 02083763-bbaf-d845-03d0-000000000114 11762 1726853256.86605: variable 'ansible_search_path' from source: unknown 11762 1726853256.86608: variable 'ansible_search_path' from source: unknown 11762 1726853256.86635: calling self._execute() 11762 1726853256.86691: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853256.86694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853256.86702: variable 'omit' from source: magic vars 11762 1726853256.86948: variable 'ansible_distribution_major_version' from source: facts 11762 1726853256.86956: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853256.87033: variable 'ansible_os_family' from source: facts 11762 1726853256.87036: Evaluated conditional (ansible_os_family == 'RedHat'): True 11762 1726853256.87153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853256.87333: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853256.87363: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853256.87389: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853256.87414: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853256.87467: variable 'ansible_distribution_major_version' from source: facts 11762 1726853256.87478: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11762 1726853256.87483: variable 'omit' from source: magic vars 11762 1726853256.87510: variable 
'omit' from source: magic vars 11762 1726853256.87608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853256.89121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853256.89164: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853256.89192: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853256.89214: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853256.89234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853256.89304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853256.89323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853256.89342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853256.89373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853256.89384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853256.89474: variable 
'__network_is_ostree' from source: set_fact 11762 1726853256.89477: variable 'omit' from source: magic vars 11762 1726853256.89480: variable 'omit' from source: magic vars 11762 1726853256.89495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853256.89515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853256.89529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853256.89541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853256.89551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853256.89578: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853256.89582: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853256.89584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853256.89648: Set connection var ansible_timeout to 10 11762 1726853256.89652: Set connection var ansible_shell_type to sh 11762 1726853256.89660: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853256.89666: Set connection var ansible_shell_executable to /bin/sh 11762 1726853256.89676: Set connection var ansible_pipelining to False 11762 1726853256.89686: Set connection var ansible_connection to ssh 11762 1726853256.89715: variable 'ansible_shell_executable' from source: unknown 11762 1726853256.89718: variable 'ansible_connection' from source: unknown 11762 1726853256.89721: variable 'ansible_module_compression' from source: unknown 11762 1726853256.89724: variable 'ansible_shell_type' from source: unknown 11762 1726853256.89726: variable 'ansible_shell_executable' from source: unknown 11762 
1726853256.89728: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853256.89733: variable 'ansible_pipelining' from source: unknown 11762 1726853256.89735: variable 'ansible_timeout' from source: unknown 11762 1726853256.89739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853256.89813: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853256.89823: variable 'omit' from source: magic vars 11762 1726853256.89931: starting attempt loop 11762 1726853256.89935: running the handler 11762 1726853256.89938: variable 'ansible_facts' from source: unknown 11762 1726853256.89940: variable 'ansible_facts' from source: unknown 11762 1726853256.89942: _low_level_execute_command(): starting 11762 1726853256.89944: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853256.90779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853256.90784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853256.90788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853256.90891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853256.90960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853256.92737: stdout chunk (state=3): >>>/root <<< 11762 1726853256.92867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853256.92906: stderr chunk (state=3): >>><<< 11762 1726853256.92910: stdout chunk (state=3): >>><<< 11762 1726853256.92932: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853256.93042: _low_level_execute_command(): starting 11762 1726853256.93049: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602 `" && echo ansible-tmp-1726853256.9294076-12136-252015295339602="` echo /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602 `" ) && sleep 0' 11762 1726853256.93595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853256.93620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853256.93633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853256.93720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853256.95754: stdout chunk (state=3): 
>>>ansible-tmp-1726853256.9294076-12136-252015295339602=/root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602 <<< 11762 1726853256.95867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853256.95893: stderr chunk (state=3): >>><<< 11762 1726853256.95897: stdout chunk (state=3): >>><<< 11762 1726853256.95911: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853256.9294076-12136-252015295339602=/root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853256.95937: variable 'ansible_module_compression' from source: unknown 11762 1726853256.95987: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11762 1726853256.96022: variable 'ansible_facts' 
from source: unknown 11762 1726853256.96107: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/AnsiballZ_dnf.py 11762 1726853256.96207: Sending initial data 11762 1726853256.96211: Sent initial data (152 bytes) 11762 1726853256.96662: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853256.96666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853256.96668: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853256.96672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853256.96726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853256.96734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853256.96737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853256.96805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853256.98467: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11762 1726853256.98477: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853256.98534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853256.98604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpnw72_m7t /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/AnsiballZ_dnf.py <<< 11762 1726853256.98611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/AnsiballZ_dnf.py" <<< 11762 1726853256.98677: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpnw72_m7t" to remote "/root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/AnsiballZ_dnf.py" <<< 11762 1726853256.98679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/AnsiballZ_dnf.py" <<< 11762 1726853256.99482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853256.99520: stderr chunk (state=3): >>><<< 11762 1726853256.99524: stdout chunk (state=3): >>><<< 11762 1726853256.99576: done transferring module to remote 11762 1726853256.99579: _low_level_execute_command(): starting 11762 
1726853256.99582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/ /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/AnsiballZ_dnf.py && sleep 0' 11762 1726853257.00019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853257.00022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853257.00024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.00026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.00028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853257.00030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.00083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853257.00087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853257.00164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853257.02061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853257.02090: 
stderr chunk (state=3): >>><<< 11762 1726853257.02093: stdout chunk (state=3): >>><<< 11762 1726853257.02109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853257.02112: _low_level_execute_command(): starting 11762 1726853257.02117: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/AnsiballZ_dnf.py && sleep 0' 11762 1726853257.02551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853257.02554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.02556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853257.02559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.02612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853257.02616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853257.02620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853257.02697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853257.48394: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11762 
1726853257.53028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853257.53039: stderr chunk (state=3): >>><<< 11762 1726853257.53042: stdout chunk (state=3): >>><<< 11762 1726853257.53060: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853257.53097: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853257.53102: _low_level_execute_command(): starting 11762 1726853257.53108: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853256.9294076-12136-252015295339602/ > /dev/null 2>&1 && sleep 0' 11762 1726853257.53559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.53563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.53565: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853257.53568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853257.53571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.53621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853257.53627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853257.53629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853257.53698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853257.55642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853257.55667: stderr chunk (state=3): >>><<< 11762 1726853257.55670: stdout chunk (state=3): >>><<< 11762 1726853257.55684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853257.55690: handler run complete 11762 1726853257.55714: attempt loop complete, returning result 11762 1726853257.55717: _execute() done 11762 1726853257.55719: dumping result to json 11762 1726853257.55726: done dumping result, returning 11762 1726853257.55735: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [02083763-bbaf-d845-03d0-000000000114] 11762 1726853257.55737: sending task result for task 02083763-bbaf-d845-03d0-000000000114 11762 1726853257.55829: done sending task result for task 02083763-bbaf-d845-03d0-000000000114 11762 1726853257.55832: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11762 1726853257.55899: no more pending results, returning what we have 11762 1726853257.55902: results queue empty 11762 1726853257.55903: checking for any_errors_fatal 11762 1726853257.55907: done checking for any_errors_fatal 11762 1726853257.55908: checking for max_fail_percentage 11762 1726853257.55909: done checking for max_fail_percentage 11762 1726853257.55910: checking to see if all hosts have failed and the running result is not ok 11762 1726853257.55911: done checking to see if all hosts have failed 11762 1726853257.55911: getting the remaining hosts for this loop 11762 1726853257.55913: done getting the remaining hosts for this loop 11762 1726853257.55916: getting the next task for host managed_node2 11762 1726853257.55923: done getting next task for host managed_node2 11762 1726853257.55925: 
^ task is: TASK: Create test interfaces 11762 1726853257.55928: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853257.55930: getting variables 11762 1726853257.55932: in VariableManager get_vars() 11762 1726853257.55960: Calling all_inventory to load vars for managed_node2 11762 1726853257.55963: Calling groups_inventory to load vars for managed_node2 11762 1726853257.55966: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853257.55984: Calling all_plugins_play to load vars for managed_node2 11762 1726853257.55987: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853257.55991: Calling groups_plugins_play to load vars for managed_node2 11762 1726853257.56137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853257.56255: done with get_vars() 11762 1726853257.56263: done getting variables 11762 1726853257.56333: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 13:27:37 -0400 (0:00:00.701) 0:00:07.993 ****** 11762 1726853257.56354: entering _queue_task() for managed_node2/shell 11762 1726853257.56359: Creating lock for shell 11762 1726853257.56549: worker is 1 (out of 1 available) 11762 1726853257.56564: exiting _queue_task() for managed_node2/shell 11762 1726853257.56577: done queuing things up, now waiting for results queue to drain 11762 1726853257.56579: waiting for pending results... 11762 1726853257.56729: running TaskExecutor() for managed_node2/TASK: Create test interfaces 11762 1726853257.56792: in run() - task 02083763-bbaf-d845-03d0-000000000115 11762 1726853257.56810: variable 'ansible_search_path' from source: unknown 11762 1726853257.56814: variable 'ansible_search_path' from source: unknown 11762 1726853257.56835: calling self._execute() 11762 1726853257.56892: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853257.56896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853257.56904: variable 'omit' from source: magic vars 11762 1726853257.57161: variable 'ansible_distribution_major_version' from source: facts 11762 1726853257.57170: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853257.57177: variable 'omit' from source: magic vars 11762 1726853257.57207: variable 'omit' from source: magic vars 11762 1726853257.57490: variable 'dhcp_interface1' from source: play vars 11762 1726853257.57495: variable 'dhcp_interface2' from source: play vars 11762 1726853257.57511: variable 'omit' 
from source: magic vars 11762 1726853257.57541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853257.57574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853257.57587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853257.57601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853257.57609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853257.57631: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853257.57634: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853257.57637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853257.57708: Set connection var ansible_timeout to 10 11762 1726853257.57712: Set connection var ansible_shell_type to sh 11762 1726853257.57716: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853257.57721: Set connection var ansible_shell_executable to /bin/sh 11762 1726853257.57728: Set connection var ansible_pipelining to False 11762 1726853257.57734: Set connection var ansible_connection to ssh 11762 1726853257.57752: variable 'ansible_shell_executable' from source: unknown 11762 1726853257.57755: variable 'ansible_connection' from source: unknown 11762 1726853257.57758: variable 'ansible_module_compression' from source: unknown 11762 1726853257.57760: variable 'ansible_shell_type' from source: unknown 11762 1726853257.57762: variable 'ansible_shell_executable' from source: unknown 11762 1726853257.57764: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853257.57768: variable 'ansible_pipelining' from source: unknown 
11762 1726853257.57772: variable 'ansible_timeout' from source: unknown 11762 1726853257.57777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853257.57876: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853257.57885: variable 'omit' from source: magic vars 11762 1726853257.57890: starting attempt loop 11762 1726853257.57892: running the handler 11762 1726853257.57903: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853257.57917: _low_level_execute_command(): starting 11762 1726853257.57923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853257.58417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.58420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.58423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.58425: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.58477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853257.58480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853257.58493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853257.58572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853257.60321: stdout chunk (state=3): >>>/root <<< 11762 1726853257.60475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853257.60506: stderr chunk (state=3): >>><<< 11762 1726853257.60509: stdout chunk (state=3): >>><<< 11762 1726853257.60529: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853257.60540: _low_level_execute_command(): starting 11762 1726853257.60549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040 `" && echo ansible-tmp-1726853257.6052837-12167-274803696756040="` echo /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040 `" ) && sleep 0' 11762 1726853257.60991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853257.61001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853257.61004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.61006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.61008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.61062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 
setting O_NONBLOCK <<< 11762 1726853257.61065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853257.61132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853257.63105: stdout chunk (state=3): >>>ansible-tmp-1726853257.6052837-12167-274803696756040=/root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040 <<< 11762 1726853257.63215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853257.63243: stderr chunk (state=3): >>><<< 11762 1726853257.63249: stdout chunk (state=3): >>><<< 11762 1726853257.63262: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853257.6052837-12167-274803696756040=/root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 11762 1726853257.63298: variable 'ansible_module_compression' from source: unknown 11762 1726853257.63336: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853257.63361: variable 'ansible_facts' from source: unknown 11762 1726853257.63412: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/AnsiballZ_command.py 11762 1726853257.63511: Sending initial data 11762 1726853257.63514: Sent initial data (156 bytes) 11762 1726853257.63964: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.63968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853257.63970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853257.63974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.64028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853257.64035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853257.64037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11762 1726853257.64110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853257.65821: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853257.65886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853257.65958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpi1a0wsmx /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/AnsiballZ_command.py <<< 11762 1726853257.65964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/AnsiballZ_command.py" <<< 11762 1726853257.66027: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpi1a0wsmx" to remote "/root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/AnsiballZ_command.py" <<< 11762 1726853257.66031: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/AnsiballZ_command.py" <<< 11762 1726853257.66684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853257.66729: stderr chunk (state=3): >>><<< 11762 1726853257.66732: stdout chunk (state=3): >>><<< 11762 1726853257.66769: done transferring module to remote 11762 1726853257.66780: _low_level_execute_command(): starting 11762 1726853257.66786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/ /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/AnsiballZ_command.py && sleep 0' 11762 1726853257.67240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.67243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853257.67249: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.67251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.67253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.67305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853257.67308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853257.67313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853257.67386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853257.69249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853257.69276: stderr chunk (state=3): >>><<< 11762 1726853257.69279: stdout chunk (state=3): >>><<< 11762 1726853257.69293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853257.69301: _low_level_execute_command(): starting 11762 1726853257.69304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/AnsiballZ_command.py && sleep 0' 11762 1726853257.69759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.69762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.69764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853257.69766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853257.69818: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853257.69825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853257.69827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853257.69900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.09105: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set 
test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:27:37.854775", "end": "2024-09-20 13:27:39.089057", "delta": "0:00:01.234282", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853259.10716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853259.10739: stderr chunk (state=3): >>><<< 11762 1726853259.10742: stdout chunk (state=3): >>><<< 11762 1726853259.10776: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:27:37.854775", "end": "2024-09-20 13:27:39.089057", "delta": "0:00:01.234282", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853259.10906: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853259.10911: _low_level_execute_command(): starting 11762 1726853259.10914: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853257.6052837-12167-274803696756040/ > /dev/null 2>&1 && sleep 0' 11762 1726853259.11428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853259.11441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.11455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853259.11474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853259.11489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853259.11499: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853259.11589: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.11610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.11709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.13629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.13688: stderr chunk (state=3): >>><<< 11762 1726853259.13698: stdout chunk (state=3): >>><<< 11762 1726853259.13726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.13731: handler run complete 11762 1726853259.13764: Evaluated conditional (False): False 11762 1726853259.13788: attempt loop complete, returning result 11762 1726853259.13791: _execute() done 11762 1726853259.13793: dumping result to json 11762 1726853259.13795: done dumping result, returning 11762 1726853259.13800: done running TaskExecutor() for managed_node2/TASK: Create test interfaces [02083763-bbaf-d845-03d0-000000000115] 11762 1726853259.13805: sending task result for task 02083763-bbaf-d845-03d0-000000000115 11762 1726853259.13908: done sending task result for task 02083763-bbaf-d845-03d0-000000000115 11762 1726853259.13910: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.234282", "end": "2024-09-20 13:27:39.089057", "rc": 0, "start": "2024-09-20 13:27:37.854775" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6954 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6954 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11762 1726853259.13992: no more pending results, returning what we have 11762 1726853259.13996: results queue empty 11762 1726853259.13997: checking for any_errors_fatal 11762 1726853259.14004: done checking for any_errors_fatal 11762 1726853259.14004: checking for max_fail_percentage 11762 1726853259.14007: done checking for max_fail_percentage 11762 1726853259.14008: checking to see if all hosts have failed 
and the running result is not ok 11762 1726853259.14008: done checking to see if all hosts have failed 11762 1726853259.14009: getting the remaining hosts for this loop 11762 1726853259.14011: done getting the remaining hosts for this loop 11762 1726853259.14014: getting the next task for host managed_node2 11762 1726853259.14025: done getting next task for host managed_node2 11762 1726853259.14028: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11762 1726853259.14031: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853259.14035: getting variables 11762 1726853259.14036: in VariableManager get_vars() 11762 1726853259.14064: Calling all_inventory to load vars for managed_node2 11762 1726853259.14067: Calling groups_inventory to load vars for managed_node2 11762 1726853259.14070: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853259.14088: Calling all_plugins_play to load vars for managed_node2 11762 1726853259.14091: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853259.14094: Calling groups_plugins_play to load vars for managed_node2 11762 1726853259.14264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853259.14383: done with get_vars() 11762 1726853259.14391: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:39 -0400 (0:00:01.581) 0:00:09.575 ****** 11762 1726853259.14461: entering _queue_task() for managed_node2/include_tasks 11762 1726853259.14652: worker is 1 (out of 1 available) 11762 1726853259.14666: exiting _queue_task() for managed_node2/include_tasks 11762 1726853259.14679: done queuing things up, now waiting for results queue to drain 11762 1726853259.14681: waiting for pending results... 
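The "Create test interfaces" task result above embeds a shell script whose core plumbing is two veth pairs with their peer ends enslaved to a bridge. A minimal dry-run sketch of that plumbing (it prints the iproute2 commands instead of executing them; drop the `echo` in `run()` to apply for real, which requires root):

```shell
# Dry-run sketch of the interface plumbing in the task above: testN stays
# free for the tests, testNp joins the bridge. 'run' only echoes here.
run() { echo "$@"; }

plumb_test_ifaces() {
    for i in 1 2; do
        run ip link add "test$i" type veth peer name "test${i}p"
        run ip link set "test${i}p" up
    done
    # bridge carrying both the 192.0.2.0/24 v4 and 2001:DB8::/32 v6 addresses
    run ip link add name testbr type bridge forward_delay 0
    run ip link set testbr up
    run ip addr add 192.0.2.1/24 dev testbr
    run ip -6 addr add 2001:DB8::1/32 dev testbr
    for i in 1 2; do
        run ip link set "test${i}p" master testbr
    done
}

plumb_test_ifaces
```

This mirrors the non-RHEL6 branch of the script (`ip link set ... master testbr`); the RHEL6 branch achieves the same enslavement with `brctl addif`.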
11762 1726853259.14853: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11762 1726853259.14954: in run() - task 02083763-bbaf-d845-03d0-00000000011c 11762 1726853259.15176: variable 'ansible_search_path' from source: unknown 11762 1726853259.15180: variable 'ansible_search_path' from source: unknown 11762 1726853259.15183: calling self._execute() 11762 1726853259.15186: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.15189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.15191: variable 'omit' from source: magic vars 11762 1726853259.15474: variable 'ansible_distribution_major_version' from source: facts 11762 1726853259.15491: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853259.15502: _execute() done 11762 1726853259.15510: dumping result to json 11762 1726853259.15517: done dumping result, returning 11762 1726853259.15528: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-d845-03d0-00000000011c] 11762 1726853259.15544: sending task result for task 02083763-bbaf-d845-03d0-00000000011c 11762 1726853259.15685: no more pending results, returning what we have 11762 1726853259.15689: in VariableManager get_vars() 11762 1726853259.15722: Calling all_inventory to load vars for managed_node2 11762 1726853259.15725: Calling groups_inventory to load vars for managed_node2 11762 1726853259.15729: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853259.15742: Calling all_plugins_play to load vars for managed_node2 11762 1726853259.15746: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853259.15749: Calling groups_plugins_play to load vars for managed_node2 11762 1726853259.16059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853259.16378: done 
with get_vars() 11762 1726853259.16386: variable 'ansible_search_path' from source: unknown 11762 1726853259.16387: variable 'ansible_search_path' from source: unknown 11762 1726853259.16401: done sending task result for task 02083763-bbaf-d845-03d0-00000000011c 11762 1726853259.16403: WORKER PROCESS EXITING 11762 1726853259.16435: we have included files to process 11762 1726853259.16436: generating all_blocks data 11762 1726853259.16438: done generating all_blocks data 11762 1726853259.16444: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853259.16445: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853259.16447: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853259.16677: done processing included file 11762 1726853259.16680: iterating over new_blocks loaded from include file 11762 1726853259.16681: in VariableManager get_vars() 11762 1726853259.16696: done with get_vars() 11762 1726853259.16697: filtering new block on tags 11762 1726853259.16727: done filtering new block on tags 11762 1726853259.16729: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11762 1726853259.16733: extending task lists for all hosts with included blocks 11762 1726853259.17052: done extending task lists 11762 1726853259.17053: done processing included files 11762 1726853259.17054: results queue empty 11762 1726853259.17055: checking for any_errors_fatal 11762 1726853259.17060: done checking for any_errors_fatal 11762 1726853259.17060: checking for max_fail_percentage 11762 1726853259.17062: done checking for max_fail_percentage 
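The `while ! ip addr show testbr | grep -q 'inet [1-9]'` loop in the script above is a workaround for a NetworkManager bug (https://bugzilla.redhat.com/show_bug.cgi?id=2079642): it keeps re-adding the bridge address until it sticks or 30 tries elapse. The pattern generalizes to a bounded retry loop; a sketch with stand-in `check`/`apply` commands in place of `ip addr show`/`ip addr add`:

```shell
# Bounded retry: re-run 'apply' until 'check' succeeds or max tries elapse.
# The original sleeps 1 s between rounds; the sleep is elided here.
retry_until() {    # usage: retry_until <max_tries> <check_cmd> <apply_cmd>
    max=$1; check=$2; apply=$3; tries=0
    until $check; do
        tries=$((tries + 1))
        if [ "$tries" -ge "$max" ]; then
            echo "ERROR - gave up after $max tries" >&2
            return 1
        fi
        # a failed apply is not fatal; it just triggers another round,
        # exactly like the rc=$? / continue handling in the original
        $apply || continue
    done
    return 0
}
```

Note that the original loop treats a failed `ip addr add` as retryable (`continue`) rather than fatal, which is what papers over NetworkManager racing with the address assignment.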
11762 1726853259.17062: checking to see if all hosts have failed and the running result is not ok 11762 1726853259.17063: done checking to see if all hosts have failed 11762 1726853259.17064: getting the remaining hosts for this loop 11762 1726853259.17065: done getting the remaining hosts for this loop 11762 1726853259.17073: getting the next task for host managed_node2 11762 1726853259.17078: done getting next task for host managed_node2 11762 1726853259.17081: ^ task is: TASK: Get stat for interface {{ interface }} 11762 1726853259.17084: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853259.17086: getting variables 11762 1726853259.17087: in VariableManager get_vars() 11762 1726853259.17095: Calling all_inventory to load vars for managed_node2 11762 1726853259.17097: Calling groups_inventory to load vars for managed_node2 11762 1726853259.17100: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853259.17105: Calling all_plugins_play to load vars for managed_node2 11762 1726853259.17107: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853259.17110: Calling groups_plugins_play to load vars for managed_node2 11762 1726853259.17247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853259.17445: done with get_vars() 11762 1726853259.17454: done getting variables 11762 1726853259.17611: variable 'interface' from source: task vars 11762 1726853259.17615: variable 'dhcp_interface1' from source: play vars 11762 1726853259.17677: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:39 -0400 (0:00:00.032) 0:00:09.607 ****** 11762 1726853259.17707: entering _queue_task() for managed_node2/stat 11762 1726853259.18189: worker is 1 (out of 1 available) 11762 1726853259.18197: exiting _queue_task() for managed_node2/stat 11762 1726853259.18208: done queuing things up, now waiting for results queue to drain 11762 1726853259.18209: waiting for pending results... 
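The included `get_interface_stat.yml` reduces to a `stat` of the device's sysfs entry; the `module_args` in the result below confirm the path and the disabled attribute/checksum/mime collection. A hypothetical reconstruction (the real file lives in the `fedora.linux_system_roles` test collection and may differ; the `register` name is an assumption):

```yaml
# Sketch of tasks/get_interface_stat.yml: a device exists iff its
# /sys/class/net/<name> entry does. 'interface_stat' is a guessed name.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```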
11762 1726853259.18234: running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 11762 1726853259.18366: in run() - task 02083763-bbaf-d845-03d0-00000000017b 11762 1726853259.18387: variable 'ansible_search_path' from source: unknown 11762 1726853259.18394: variable 'ansible_search_path' from source: unknown 11762 1726853259.18436: calling self._execute() 11762 1726853259.18509: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.18520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.18531: variable 'omit' from source: magic vars 11762 1726853259.19206: variable 'ansible_distribution_major_version' from source: facts 11762 1726853259.19221: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853259.19231: variable 'omit' from source: magic vars 11762 1726853259.19476: variable 'omit' from source: magic vars 11762 1726853259.19563: variable 'interface' from source: task vars 11762 1726853259.19613: variable 'dhcp_interface1' from source: play vars 11762 1726853259.19684: variable 'dhcp_interface1' from source: play vars 11762 1726853259.19799: variable 'omit' from source: magic vars 11762 1726853259.19851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853259.19894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853259.19943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853259.19975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853259.19993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853259.20026: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11762 1726853259.20034: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.20041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.20173: Set connection var ansible_timeout to 10 11762 1726853259.20176: Set connection var ansible_shell_type to sh 11762 1726853259.20178: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853259.20180: Set connection var ansible_shell_executable to /bin/sh 11762 1726853259.20202: Set connection var ansible_pipelining to False 11762 1726853259.20462: Set connection var ansible_connection to ssh 11762 1726853259.20465: variable 'ansible_shell_executable' from source: unknown 11762 1726853259.20468: variable 'ansible_connection' from source: unknown 11762 1726853259.20472: variable 'ansible_module_compression' from source: unknown 11762 1726853259.20475: variable 'ansible_shell_type' from source: unknown 11762 1726853259.20477: variable 'ansible_shell_executable' from source: unknown 11762 1726853259.20478: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.20480: variable 'ansible_pipelining' from source: unknown 11762 1726853259.20483: variable 'ansible_timeout' from source: unknown 11762 1726853259.20485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.20649: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853259.20669: variable 'omit' from source: magic vars 11762 1726853259.20683: starting attempt loop 11762 1726853259.20690: running the handler 11762 1726853259.20706: _low_level_execute_command(): starting 11762 1726853259.20718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853259.21401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853259.21413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.21429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853259.21492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.21551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.21568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.21597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.21766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.23504: stdout chunk (state=3): >>>/root <<< 11762 1726853259.23680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.23691: stderr chunk (state=3): >>><<< 11762 1726853259.23709: stdout chunk (state=3): >>><<< 11762 1726853259.23729: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.23830: _low_level_execute_command(): starting 11762 1726853259.23834: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433 `" && echo ansible-tmp-1726853259.2373607-12213-128744551289433="` echo /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433 `" ) && sleep 0' 11762 1726853259.24966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.25106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.25193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.27179: stdout chunk (state=3): >>>ansible-tmp-1726853259.2373607-12213-128744551289433=/root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433 <<< 11762 1726853259.27339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.27353: stdout chunk (state=3): >>><<< 11762 1726853259.27388: stderr chunk (state=3): >>><<< 11762 1726853259.27452: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853259.2373607-12213-128744551289433=/root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.27507: variable 'ansible_module_compression' from source: unknown 11762 1726853259.27578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853259.27770: variable 'ansible_facts' from source: unknown 11762 1726853259.27820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/AnsiballZ_stat.py 11762 1726853259.28000: Sending initial data 11762 1726853259.28008: Sent initial data (153 bytes) 11762 1726853259.28617: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853259.28635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.28654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853259.28762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.28785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.28800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.28903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.30683: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853259.30782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853259.30861: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpv32hp_w3 /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/AnsiballZ_stat.py <<< 11762 1726853259.30880: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/AnsiballZ_stat.py" <<< 11762 1726853259.30933: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpv32hp_w3" to remote "/root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/AnsiballZ_stat.py" <<< 11762 1726853259.31981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.31984: stdout chunk (state=3): >>><<< 11762 1726853259.31987: stderr chunk (state=3): >>><<< 11762 1726853259.32042: done transferring module to remote 11762 1726853259.32137: _low_level_execute_command(): starting 11762 1726853259.32141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/ /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/AnsiballZ_stat.py && sleep 0' 11762 1726853259.32736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853259.32751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.32785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853259.32806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853259.32901: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.32936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.33036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.35024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.35028: stdout chunk (state=3): >>><<< 11762 1726853259.35031: stderr chunk (state=3): >>><<< 11762 1726853259.35140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.35143: _low_level_execute_command(): starting 11762 1726853259.35146: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/AnsiballZ_stat.py && sleep 0' 11762 1726853259.35763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.35788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.35803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.35911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
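The `AnsiballZ_stat.py` run whose output follows targets `/sys/class/net`, where the Linux kernel exposes one symlink per registered network device, so a bare existence test is enough to assert "device present". A quick shell equivalent (Linux-only, since it depends on sysfs; `no_such_dev0` is just an illustrative name):

```shell
# Each network device appears as a symlink under /sys/class/net
# (e.g. test1 -> ../../devices/virtual/net/test1 in the result below).
iface_present() { [ -e "/sys/class/net/$1" ]; }

iface_present lo && echo "lo: present"
iface_present no_such_dev0 || echo "no_such_dev0: absent"
```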
11762 1726853259.51481: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26376, "dev": 23, "nlink": 1, "atime": 1726853257.8616068, "mtime": 1726853257.8616068, "ctime": 1726853257.8616068, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853259.53138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853259.53142: stdout chunk (state=3): >>><<< 11762 1726853259.53148: stderr chunk (state=3): >>><<< 11762 1726853259.53167: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26376, "dev": 23, "nlink": 1, "atime": 1726853257.8616068, "mtime": 1726853257.8616068, "ctime": 1726853257.8616068, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853259.53227: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853259.53577: _low_level_execute_command(): starting 11762 1726853259.53581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853259.2373607-12213-128744551289433/ > /dev/null 2>&1 && sleep 0' 11762 1726853259.55213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.55289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.55447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.55460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.55558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.57552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.57588: stderr chunk (state=3): >>><<< 11762 1726853259.57779: stdout chunk (state=3): >>><<< 11762 1726853259.57783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.57786: handler run complete 11762 1726853259.57788: attempt loop complete, returning result 11762 1726853259.57790: _execute() done 11762 1726853259.57792: dumping result to json 11762 1726853259.57794: done dumping result, returning 11762 1726853259.57796: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 [02083763-bbaf-d845-03d0-00000000017b] 11762 1726853259.57798: sending task result for task 02083763-bbaf-d845-03d0-00000000017b 11762 1726853259.58008: done sending task result for task 02083763-bbaf-d845-03d0-00000000017b 11762 1726853259.58010: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853257.8616068, "block_size": 4096, "blocks": 0, "ctime": 1726853257.8616068, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26376, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726853257.8616068, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11762 1726853259.58117: no more pending results, returning what we have 11762 1726853259.58121: results queue empty 11762 1726853259.58122: checking for any_errors_fatal 11762 1726853259.58123: done checking for any_errors_fatal 11762 1726853259.58123: checking for 
max_fail_percentage 11762 1726853259.58125: done checking for max_fail_percentage 11762 1726853259.58126: checking to see if all hosts have failed and the running result is not ok 11762 1726853259.58127: done checking to see if all hosts have failed 11762 1726853259.58127: getting the remaining hosts for this loop 11762 1726853259.58129: done getting the remaining hosts for this loop 11762 1726853259.58132: getting the next task for host managed_node2 11762 1726853259.58140: done getting next task for host managed_node2 11762 1726853259.58143: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11762 1726853259.58149: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853259.58153: getting variables 11762 1726853259.58154: in VariableManager get_vars() 11762 1726853259.58187: Calling all_inventory to load vars for managed_node2 11762 1726853259.58190: Calling groups_inventory to load vars for managed_node2 11762 1726853259.58193: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853259.58204: Calling all_plugins_play to load vars for managed_node2 11762 1726853259.58207: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853259.58210: Calling groups_plugins_play to load vars for managed_node2 11762 1726853259.59243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853259.59641: done with get_vars() 11762 1726853259.59653: done getting variables 11762 1726853259.59962: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11762 1726853259.60291: variable 'interface' from source: task vars 11762 1726853259.60295: variable 'dhcp_interface1' from source: play vars 11762 1726853259.60357: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:39 -0400 (0:00:00.426) 0:00:10.034 ****** 11762 1726853259.60392: entering _queue_task() for managed_node2/assert 11762 1726853259.60394: Creating lock for assert 11762 1726853259.61307: worker is 1 (out of 1 available) 11762 1726853259.61318: exiting _queue_task() for managed_node2/assert 11762 1726853259.61328: done queuing things up, now waiting for results queue to drain 11762 
1726853259.61330: waiting for pending results... 11762 1726853259.61529: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' 11762 1726853259.61996: in run() - task 02083763-bbaf-d845-03d0-00000000011d 11762 1726853259.61999: variable 'ansible_search_path' from source: unknown 11762 1726853259.62001: variable 'ansible_search_path' from source: unknown 11762 1726853259.62004: calling self._execute() 11762 1726853259.62106: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.62117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.62129: variable 'omit' from source: magic vars 11762 1726853259.62927: variable 'ansible_distribution_major_version' from source: facts 11762 1726853259.62943: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853259.62957: variable 'omit' from source: magic vars 11762 1726853259.63019: variable 'omit' from source: magic vars 11762 1726853259.63241: variable 'interface' from source: task vars 11762 1726853259.63376: variable 'dhcp_interface1' from source: play vars 11762 1726853259.63437: variable 'dhcp_interface1' from source: play vars 11762 1726853259.63465: variable 'omit' from source: magic vars 11762 1726853259.63777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853259.63781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853259.63784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853259.63803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853259.63817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853259.63851: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853259.63859: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.63866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.63969: Set connection var ansible_timeout to 10 11762 1726853259.64108: Set connection var ansible_shell_type to sh 11762 1726853259.64119: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853259.64128: Set connection var ansible_shell_executable to /bin/sh 11762 1726853259.64140: Set connection var ansible_pipelining to False 11762 1726853259.64153: Set connection var ansible_connection to ssh 11762 1726853259.64181: variable 'ansible_shell_executable' from source: unknown 11762 1726853259.64213: variable 'ansible_connection' from source: unknown 11762 1726853259.64220: variable 'ansible_module_compression' from source: unknown 11762 1726853259.64318: variable 'ansible_shell_type' from source: unknown 11762 1726853259.64322: variable 'ansible_shell_executable' from source: unknown 11762 1726853259.64324: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.64326: variable 'ansible_pipelining' from source: unknown 11762 1726853259.64329: variable 'ansible_timeout' from source: unknown 11762 1726853259.64331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.64574: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853259.64662: variable 'omit' from source: magic vars 11762 1726853259.64675: starting attempt loop 11762 1726853259.64683: running the handler 11762 1726853259.64932: variable 'interface_stat' from source: set_fact 11762 
1726853259.64994: Evaluated conditional (interface_stat.stat.exists): True 11762 1726853259.65004: handler run complete 11762 1726853259.65176: attempt loop complete, returning result 11762 1726853259.65181: _execute() done 11762 1726853259.65183: dumping result to json 11762 1726853259.65186: done dumping result, returning 11762 1726853259.65188: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' [02083763-bbaf-d845-03d0-00000000011d] 11762 1726853259.65190: sending task result for task 02083763-bbaf-d845-03d0-00000000011d ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853259.65328: no more pending results, returning what we have 11762 1726853259.65331: results queue empty 11762 1726853259.65332: checking for any_errors_fatal 11762 1726853259.65341: done checking for any_errors_fatal 11762 1726853259.65342: checking for max_fail_percentage 11762 1726853259.65347: done checking for max_fail_percentage 11762 1726853259.65348: checking to see if all hosts have failed and the running result is not ok 11762 1726853259.65349: done checking to see if all hosts have failed 11762 1726853259.65350: getting the remaining hosts for this loop 11762 1726853259.65352: done getting the remaining hosts for this loop 11762 1726853259.65356: getting the next task for host managed_node2 11762 1726853259.65365: done getting next task for host managed_node2 11762 1726853259.65368: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11762 1726853259.65377: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853259.65382: getting variables 11762 1726853259.65384: in VariableManager get_vars() 11762 1726853259.65417: Calling all_inventory to load vars for managed_node2 11762 1726853259.65420: Calling groups_inventory to load vars for managed_node2 11762 1726853259.65425: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853259.65437: Calling all_plugins_play to load vars for managed_node2 11762 1726853259.65441: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853259.65447: Calling groups_plugins_play to load vars for managed_node2 11762 1726853259.65740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853259.66361: done with get_vars() 11762 1726853259.66585: done sending task result for task 02083763-bbaf-d845-03d0-00000000011d 11762 1726853259.66588: WORKER PROCESS EXITING 11762 1726853259.66593: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:39 -0400 (0:00:00.063) 
0:00:10.097 ****** 11762 1726853259.66701: entering _queue_task() for managed_node2/include_tasks 11762 1726853259.67411: worker is 1 (out of 1 available) 11762 1726853259.67425: exiting _queue_task() for managed_node2/include_tasks 11762 1726853259.67437: done queuing things up, now waiting for results queue to drain 11762 1726853259.67439: waiting for pending results... 11762 1726853259.67670: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11762 1726853259.67988: in run() - task 02083763-bbaf-d845-03d0-000000000121 11762 1726853259.68009: variable 'ansible_search_path' from source: unknown 11762 1726853259.68022: variable 'ansible_search_path' from source: unknown 11762 1726853259.68237: calling self._execute() 11762 1726853259.68264: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.68353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.68368: variable 'omit' from source: magic vars 11762 1726853259.69070: variable 'ansible_distribution_major_version' from source: facts 11762 1726853259.69191: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853259.69203: _execute() done 11762 1726853259.69217: dumping result to json 11762 1726853259.69224: done dumping result, returning 11762 1726853259.69237: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-d845-03d0-000000000121] 11762 1726853259.69249: sending task result for task 02083763-bbaf-d845-03d0-000000000121 11762 1726853259.69387: no more pending results, returning what we have 11762 1726853259.69392: in VariableManager get_vars() 11762 1726853259.69432: Calling all_inventory to load vars for managed_node2 11762 1726853259.69436: Calling groups_inventory to load vars for managed_node2 11762 1726853259.69440: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853259.69456: Calling 
all_plugins_play to load vars for managed_node2 11762 1726853259.69459: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853259.69462: Calling groups_plugins_play to load vars for managed_node2 11762 1726853259.69977: done sending task result for task 02083763-bbaf-d845-03d0-000000000121 11762 1726853259.69981: WORKER PROCESS EXITING 11762 1726853259.70009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853259.70406: done with get_vars() 11762 1726853259.70415: variable 'ansible_search_path' from source: unknown 11762 1726853259.70416: variable 'ansible_search_path' from source: unknown 11762 1726853259.70568: we have included files to process 11762 1726853259.70569: generating all_blocks data 11762 1726853259.70573: done generating all_blocks data 11762 1726853259.70577: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853259.70578: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853259.70581: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853259.70997: done processing included file 11762 1726853259.70999: iterating over new_blocks loaded from include file 11762 1726853259.71001: in VariableManager get_vars() 11762 1726853259.71016: done with get_vars() 11762 1726853259.71018: filtering new block on tags 11762 1726853259.71051: done filtering new block on tags 11762 1726853259.71053: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11762 1726853259.71059: extending task lists for all hosts with included blocks 
11762 1726853259.71584: done extending task lists 11762 1726853259.71586: done processing included files 11762 1726853259.71587: results queue empty 11762 1726853259.71587: checking for any_errors_fatal 11762 1726853259.71590: done checking for any_errors_fatal 11762 1726853259.71591: checking for max_fail_percentage 11762 1726853259.71592: done checking for max_fail_percentage 11762 1726853259.71593: checking to see if all hosts have failed and the running result is not ok 11762 1726853259.71593: done checking to see if all hosts have failed 11762 1726853259.71594: getting the remaining hosts for this loop 11762 1726853259.71595: done getting the remaining hosts for this loop 11762 1726853259.71598: getting the next task for host managed_node2 11762 1726853259.71603: done getting next task for host managed_node2 11762 1726853259.71605: ^ task is: TASK: Get stat for interface {{ interface }} 11762 1726853259.71608: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853259.71611: getting variables 11762 1726853259.71612: in VariableManager get_vars() 11762 1726853259.71734: Calling all_inventory to load vars for managed_node2 11762 1726853259.71737: Calling groups_inventory to load vars for managed_node2 11762 1726853259.71739: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853259.71747: Calling all_plugins_play to load vars for managed_node2 11762 1726853259.71750: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853259.71753: Calling groups_plugins_play to load vars for managed_node2 11762 1726853259.72005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853259.72403: done with get_vars() 11762 1726853259.72412: done getting variables 11762 1726853259.72718: variable 'interface' from source: task vars 11762 1726853259.72776: variable 'dhcp_interface2' from source: play vars 11762 1726853259.72953: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:39 -0400 (0:00:00.062) 0:00:10.160 ****** 11762 1726853259.72988: entering _queue_task() for managed_node2/stat 11762 1726853259.73619: worker is 1 (out of 1 available) 11762 1726853259.73632: exiting _queue_task() for managed_node2/stat 11762 1726853259.73648: done queuing things up, now waiting for results queue to drain 11762 1726853259.73649: waiting for pending 
results... 11762 1726853259.74064: running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 11762 1726853259.74404: in run() - task 02083763-bbaf-d845-03d0-00000000019f 11762 1726853259.74576: variable 'ansible_search_path' from source: unknown 11762 1726853259.74579: variable 'ansible_search_path' from source: unknown 11762 1726853259.74583: calling self._execute() 11762 1726853259.74586: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.74588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.74591: variable 'omit' from source: magic vars 11762 1726853259.75365: variable 'ansible_distribution_major_version' from source: facts 11762 1726853259.75776: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853259.75780: variable 'omit' from source: magic vars 11762 1726853259.75782: variable 'omit' from source: magic vars 11762 1726853259.75784: variable 'interface' from source: task vars 11762 1726853259.75787: variable 'dhcp_interface2' from source: play vars 11762 1726853259.75990: variable 'dhcp_interface2' from source: play vars 11762 1726853259.76014: variable 'omit' from source: magic vars 11762 1726853259.76056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853259.76113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853259.76376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853259.76380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853259.76382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853259.76384: variable 'inventory_hostname' from source: host vars 
for 'managed_node2' 11762 1726853259.76387: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.76389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.76473: Set connection var ansible_timeout to 10 11762 1726853259.76779: Set connection var ansible_shell_type to sh 11762 1726853259.76782: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853259.76784: Set connection var ansible_shell_executable to /bin/sh 11762 1726853259.76786: Set connection var ansible_pipelining to False 11762 1726853259.76789: Set connection var ansible_connection to ssh 11762 1726853259.76791: variable 'ansible_shell_executable' from source: unknown 11762 1726853259.76793: variable 'ansible_connection' from source: unknown 11762 1726853259.76794: variable 'ansible_module_compression' from source: unknown 11762 1726853259.76796: variable 'ansible_shell_type' from source: unknown 11762 1726853259.76798: variable 'ansible_shell_executable' from source: unknown 11762 1726853259.76800: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853259.76802: variable 'ansible_pipelining' from source: unknown 11762 1726853259.76804: variable 'ansible_timeout' from source: unknown 11762 1726853259.76806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853259.76970: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853259.77189: variable 'omit' from source: magic vars 11762 1726853259.77198: starting attempt loop 11762 1726853259.77205: running the handler 11762 1726853259.77222: _low_level_execute_command(): starting 11762 1726853259.77236: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853259.78594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.78654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.78665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.78747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.78845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.80615: stdout chunk (state=3): >>>/root <<< 11762 1726853259.80808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.80941: stderr chunk (state=3): >>><<< 11762 1726853259.80951: stdout chunk (state=3): >>><<< 11762 1726853259.80988: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.81009: _low_level_execute_command(): starting 11762 1726853259.81089: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465 `" && echo ansible-tmp-1726853259.8099592-12235-261565019330465="` echo /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465 `" ) && sleep 0' 11762 1726853259.82205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.82217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.82239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.82586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.82697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.84732: stdout chunk (state=3): >>>ansible-tmp-1726853259.8099592-12235-261565019330465=/root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465 <<< 11762 1726853259.84907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.84982: stderr chunk (state=3): >>><<< 11762 1726853259.84985: stdout chunk (state=3): >>><<< 11762 1726853259.85006: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853259.8099592-12235-261565019330465=/root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.85177: variable 'ansible_module_compression' from source: unknown 11762 1726853259.85286: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853259.85349: variable 'ansible_facts' from source: unknown 11762 1726853259.85491: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/AnsiballZ_stat.py 11762 1726853259.86179: Sending initial data 11762 1726853259.86182: Sent initial data (153 bytes) 11762 1726853259.87043: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.87060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.87074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.87237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.87311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.87341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.89113: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853259.89181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853259.89249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpemjgja5z /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/AnsiballZ_stat.py <<< 11762 1726853259.89253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/AnsiballZ_stat.py" <<< 11762 1726853259.89347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpemjgja5z" to remote "/root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/AnsiballZ_stat.py" <<< 11762 1726853259.90868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.91012: stderr chunk (state=3): >>><<< 11762 1726853259.91016: stdout chunk (state=3): >>><<< 11762 1726853259.91018: done transferring module to remote 11762 1726853259.91021: _low_level_execute_command(): starting 11762 1726853259.91024: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/ /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/AnsiballZ_stat.py && sleep 0' 11762 1726853259.92388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.92518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853259.92543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853259.92785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853259.94719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853259.94741: stderr chunk (state=3): >>><<< 11762 1726853259.94827: stdout chunk (state=3): >>><<< 11762 1726853259.94843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853259.94852: _low_level_execute_command(): starting 11762 1726853259.94862: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/AnsiballZ_stat.py && sleep 0' 11762 1726853259.95912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853259.95929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853259.95948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853259.95969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853259.95989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853259.96009: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.96089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853259.96127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853259.96176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853259.96317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853260.12130: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26782, "dev": 23, "nlink": 1, "atime": 1726853257.8681612, "mtime": 1726853257.8681612, "ctime": 1726853257.8681612, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853260.13631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853260.13635: stdout chunk (state=3): >>><<< 11762 1726853260.13638: stderr chunk (state=3): >>><<< 11762 1726853260.13784: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26782, "dev": 23, "nlink": 1, "atime": 1726853257.8681612, "mtime": 1726853257.8681612, "ctime": 1726853257.8681612, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853260.13788: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853260.13791: _low_level_execute_command(): starting 11762 1726853260.13793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853259.8099592-12235-261565019330465/ > /dev/null 2>&1 && sleep 0' 11762 1726853260.14406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853260.14423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853260.14478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853260.14491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853260.14578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853260.14599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853260.14717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853260.16784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853260.16790: stdout chunk (state=3): >>><<< 11762 1726853260.16792: stderr chunk (state=3): >>><<< 11762 1726853260.16795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853260.16797: handler run complete 11762 1726853260.16799: attempt loop complete, returning result 11762 1726853260.16801: _execute() done 11762 1726853260.16821: dumping result to json 11762 1726853260.16832: done dumping result, returning 11762 1726853260.16844: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 [02083763-bbaf-d845-03d0-00000000019f] 11762 1726853260.16853: sending task result for task 02083763-bbaf-d845-03d0-00000000019f 11762 1726853260.17130: done sending task result for task 02083763-bbaf-d845-03d0-00000000019f 11762 1726853260.17134: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853257.8681612, "block_size": 4096, "blocks": 0, "ctime": 1726853257.8681612, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26782, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726853257.8681612, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11762 1726853260.17330: no more pending results, returning what we have 11762 1726853260.17333: results queue empty 11762 1726853260.17334: checking for any_errors_fatal 11762 
1726853260.17335: done checking for any_errors_fatal 11762 1726853260.17336: checking for max_fail_percentage 11762 1726853260.17338: done checking for max_fail_percentage 11762 1726853260.17339: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.17339: done checking to see if all hosts have failed 11762 1726853260.17340: getting the remaining hosts for this loop 11762 1726853260.17342: done getting the remaining hosts for this loop 11762 1726853260.17346: getting the next task for host managed_node2 11762 1726853260.17354: done getting next task for host managed_node2 11762 1726853260.17356: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11762 1726853260.17361: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853260.17365: getting variables 11762 1726853260.17366: in VariableManager get_vars() 11762 1726853260.17508: Calling all_inventory to load vars for managed_node2 11762 1726853260.17511: Calling groups_inventory to load vars for managed_node2 11762 1726853260.17514: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.17523: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.17526: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.17529: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.17846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.18358: done with get_vars() 11762 1726853260.18368: done getting variables 11762 1726853260.18426: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853260.18542: variable 'interface' from source: task vars 11762 1726853260.18548: variable 'dhcp_interface2' from source: play vars 11762 1726853260.18608: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:40 -0400 (0:00:00.456) 0:00:10.616 ****** 11762 1726853260.18648: entering _queue_task() for managed_node2/assert 11762 1726853260.18904: worker is 1 (out of 1 available) 11762 1726853260.18916: exiting _queue_task() for managed_node2/assert 11762 1726853260.18928: done queuing things up, now waiting for results queue to drain 11762 1726853260.18930: waiting for pending results... 
11762 1726853260.19299: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' 11762 1726853260.19377: in run() - task 02083763-bbaf-d845-03d0-000000000122 11762 1726853260.19381: variable 'ansible_search_path' from source: unknown 11762 1726853260.19384: variable 'ansible_search_path' from source: unknown 11762 1726853260.19387: calling self._execute() 11762 1726853260.19477: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.19491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.19577: variable 'omit' from source: magic vars 11762 1726853260.19878: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.19894: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.19904: variable 'omit' from source: magic vars 11762 1726853260.19974: variable 'omit' from source: magic vars 11762 1726853260.20079: variable 'interface' from source: task vars 11762 1726853260.20089: variable 'dhcp_interface2' from source: play vars 11762 1726853260.20159: variable 'dhcp_interface2' from source: play vars 11762 1726853260.20185: variable 'omit' from source: magic vars 11762 1726853260.20228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853260.20275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853260.20298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853260.20319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853260.20369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853260.20374: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 11762 1726853260.20380: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.20387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.20490: Set connection var ansible_timeout to 10 11762 1726853260.20498: Set connection var ansible_shell_type to sh 11762 1726853260.20508: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853260.20517: Set connection var ansible_shell_executable to /bin/sh 11762 1726853260.20527: Set connection var ansible_pipelining to False 11762 1726853260.20575: Set connection var ansible_connection to ssh 11762 1726853260.20578: variable 'ansible_shell_executable' from source: unknown 11762 1726853260.20581: variable 'ansible_connection' from source: unknown 11762 1726853260.20583: variable 'ansible_module_compression' from source: unknown 11762 1726853260.20585: variable 'ansible_shell_type' from source: unknown 11762 1726853260.20587: variable 'ansible_shell_executable' from source: unknown 11762 1726853260.20592: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.20775: variable 'ansible_pipelining' from source: unknown 11762 1726853260.20779: variable 'ansible_timeout' from source: unknown 11762 1726853260.20781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.20784: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853260.20786: variable 'omit' from source: magic vars 11762 1726853260.20788: starting attempt loop 11762 1726853260.20790: running the handler 11762 1726853260.20912: variable 'interface_stat' from source: set_fact 11762 1726853260.20937: Evaluated conditional 
(interface_stat.stat.exists): True 11762 1726853260.20950: handler run complete 11762 1726853260.20967: attempt loop complete, returning result 11762 1726853260.20975: _execute() done 11762 1726853260.20982: dumping result to json 11762 1726853260.20988: done dumping result, returning 11762 1726853260.20999: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' [02083763-bbaf-d845-03d0-000000000122] 11762 1726853260.21012: sending task result for task 02083763-bbaf-d845-03d0-000000000122 11762 1726853260.21118: done sending task result for task 02083763-bbaf-d845-03d0-000000000122 11762 1726853260.21279: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853260.21328: no more pending results, returning what we have 11762 1726853260.21331: results queue empty 11762 1726853260.21332: checking for any_errors_fatal 11762 1726853260.21341: done checking for any_errors_fatal 11762 1726853260.21342: checking for max_fail_percentage 11762 1726853260.21346: done checking for max_fail_percentage 11762 1726853260.21347: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.21348: done checking to see if all hosts have failed 11762 1726853260.21349: getting the remaining hosts for this loop 11762 1726853260.21351: done getting the remaining hosts for this loop 11762 1726853260.21354: getting the next task for host managed_node2 11762 1726853260.21363: done getting next task for host managed_node2 11762 1726853260.21366: ^ task is: TASK: Test 11762 1726853260.21369: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853260.21377: getting variables 11762 1726853260.21378: in VariableManager get_vars() 11762 1726853260.21408: Calling all_inventory to load vars for managed_node2 11762 1726853260.21411: Calling groups_inventory to load vars for managed_node2 11762 1726853260.21415: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.21427: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.21431: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.21434: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.21718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.21954: done with get_vars() 11762 1726853260.21966: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:27:40 -0400 (0:00:00.034) 0:00:10.651 ****** 11762 1726853260.22066: entering _queue_task() for managed_node2/include_tasks 11762 1726853260.22357: worker is 1 (out of 1 available) 11762 1726853260.22576: exiting _queue_task() for managed_node2/include_tasks 11762 1726853260.22589: done queuing things up, now waiting for results queue to drain 11762 1726853260.22590: waiting for pending results... 
11762 1726853260.22660: running TaskExecutor() for managed_node2/TASK: Test 11762 1726853260.22770: in run() - task 02083763-bbaf-d845-03d0-00000000008c 11762 1726853260.22795: variable 'ansible_search_path' from source: unknown 11762 1726853260.22803: variable 'ansible_search_path' from source: unknown 11762 1726853260.22858: variable 'lsr_test' from source: include params 11762 1726853260.23069: variable 'lsr_test' from source: include params 11762 1726853260.23147: variable 'omit' from source: magic vars 11762 1726853260.23269: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.23286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.23300: variable 'omit' from source: magic vars 11762 1726853260.23578: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.23581: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.23584: variable 'item' from source: unknown 11762 1726853260.23625: variable 'item' from source: unknown 11762 1726853260.23663: variable 'item' from source: unknown 11762 1726853260.23729: variable 'item' from source: unknown 11762 1726853260.24052: dumping result to json 11762 1726853260.24055: done dumping result, returning 11762 1726853260.24058: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-d845-03d0-00000000008c] 11762 1726853260.24060: sending task result for task 02083763-bbaf-d845-03d0-00000000008c 11762 1726853260.24105: done sending task result for task 02083763-bbaf-d845-03d0-00000000008c 11762 1726853260.24108: WORKER PROCESS EXITING 11762 1726853260.24131: no more pending results, returning what we have 11762 1726853260.24136: in VariableManager get_vars() 11762 1726853260.24177: Calling all_inventory to load vars for managed_node2 11762 1726853260.24180: Calling groups_inventory to load vars for managed_node2 11762 1726853260.24184: Calling all_plugins_inventory to load 
vars for managed_node2 11762 1726853260.24198: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.24201: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.24204: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.24514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.24702: done with get_vars() 11762 1726853260.24711: variable 'ansible_search_path' from source: unknown 11762 1726853260.24712: variable 'ansible_search_path' from source: unknown 11762 1726853260.24753: we have included files to process 11762 1726853260.24754: generating all_blocks data 11762 1726853260.24756: done generating all_blocks data 11762 1726853260.24760: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11762 1726853260.24761: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11762 1726853260.24764: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11762 1726853260.25188: done processing included file 11762 1726853260.25190: iterating over new_blocks loaded from include file 11762 1726853260.25191: in VariableManager get_vars() 11762 1726853260.25203: done with get_vars() 11762 1726853260.25205: filtering new block on tags 11762 1726853260.25235: done filtering new block on tags 11762 1726853260.25237: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml for managed_node2 => (item=tasks/create_bond_profile.yml) 11762 1726853260.25242: extending task lists for all hosts with included blocks 11762 1726853260.26508: done extending task 
lists 11762 1726853260.26510: done processing included files 11762 1726853260.26511: results queue empty 11762 1726853260.26511: checking for any_errors_fatal 11762 1726853260.26516: done checking for any_errors_fatal 11762 1726853260.26516: checking for max_fail_percentage 11762 1726853260.26518: done checking for max_fail_percentage 11762 1726853260.26518: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.26519: done checking to see if all hosts have failed 11762 1726853260.26520: getting the remaining hosts for this loop 11762 1726853260.26521: done getting the remaining hosts for this loop 11762 1726853260.26524: getting the next task for host managed_node2 11762 1726853260.26528: done getting next task for host managed_node2 11762 1726853260.26530: ^ task is: TASK: Include network role 11762 1726853260.26533: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853260.26536: getting variables 11762 1726853260.26536: in VariableManager get_vars() 11762 1726853260.26550: Calling all_inventory to load vars for managed_node2 11762 1726853260.26552: Calling groups_inventory to load vars for managed_node2 11762 1726853260.26555: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.26561: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.26564: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.26567: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.26717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.26909: done with get_vars() 11762 1726853260.26919: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:3 Friday 20 September 2024 13:27:40 -0400 (0:00:00.049) 0:00:10.700 ****** 11762 1726853260.27000: entering _queue_task() for managed_node2/include_role 11762 1726853260.27002: Creating lock for include_role 11762 1726853260.27329: worker is 1 (out of 1 available) 11762 1726853260.27342: exiting _queue_task() for managed_node2/include_role 11762 1726853260.27359: done queuing things up, now waiting for results queue to drain 11762 1726853260.27361: waiting for pending results... 
11762 1726853260.27791: running TaskExecutor() for managed_node2/TASK: Include network role 11762 1726853260.27797: in run() - task 02083763-bbaf-d845-03d0-0000000001c5 11762 1726853260.27800: variable 'ansible_search_path' from source: unknown 11762 1726853260.27802: variable 'ansible_search_path' from source: unknown 11762 1726853260.27824: calling self._execute() 11762 1726853260.27904: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.27921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.27934: variable 'omit' from source: magic vars 11762 1726853260.28316: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.28333: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.28349: _execute() done 11762 1726853260.28361: dumping result to json 11762 1726853260.28369: done dumping result, returning 11762 1726853260.28466: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-d845-03d0-0000000001c5] 11762 1726853260.28469: sending task result for task 02083763-bbaf-d845-03d0-0000000001c5 11762 1726853260.28790: done sending task result for task 02083763-bbaf-d845-03d0-0000000001c5 11762 1726853260.28793: WORKER PROCESS EXITING 11762 1726853260.28817: no more pending results, returning what we have 11762 1726853260.28821: in VariableManager get_vars() 11762 1726853260.28852: Calling all_inventory to load vars for managed_node2 11762 1726853260.28855: Calling groups_inventory to load vars for managed_node2 11762 1726853260.28858: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.28869: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.28874: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.28878: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.29127: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.29319: done with get_vars() 11762 1726853260.29327: variable 'ansible_search_path' from source: unknown 11762 1726853260.29328: variable 'ansible_search_path' from source: unknown 11762 1726853260.29518: variable 'omit' from source: magic vars 11762 1726853260.29560: variable 'omit' from source: magic vars 11762 1726853260.29577: variable 'omit' from source: magic vars 11762 1726853260.29582: we have included files to process 11762 1726853260.29583: generating all_blocks data 11762 1726853260.29584: done generating all_blocks data 11762 1726853260.29586: processing included file: fedora.linux_system_roles.network 11762 1726853260.29605: in VariableManager get_vars() 11762 1726853260.29617: done with get_vars() 11762 1726853260.29687: in VariableManager get_vars() 11762 1726853260.29703: done with get_vars() 11762 1726853260.29751: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11762 1726853260.30008: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11762 1726853260.30142: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11762 1726853260.30829: in VariableManager get_vars() 11762 1726853260.30854: done with get_vars() 11762 1726853260.31297: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11762 1726853260.33019: iterating over new_blocks loaded from include file 11762 1726853260.33022: in VariableManager get_vars() 11762 1726853260.33041: done with get_vars() 11762 1726853260.33043: filtering new block on tags 11762 1726853260.33341: done filtering new block on tags 11762 1726853260.33347: in VariableManager get_vars() 11762 1726853260.33363: done with 
get_vars() 11762 1726853260.33365: filtering new block on tags 11762 1726853260.33383: done filtering new block on tags 11762 1726853260.33385: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 11762 1726853260.33391: extending task lists for all hosts with included blocks 11762 1726853260.33556: done extending task lists 11762 1726853260.33558: done processing included files 11762 1726853260.33559: results queue empty 11762 1726853260.33559: checking for any_errors_fatal 11762 1726853260.33563: done checking for any_errors_fatal 11762 1726853260.33564: checking for max_fail_percentage 11762 1726853260.33565: done checking for max_fail_percentage 11762 1726853260.33566: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.33567: done checking to see if all hosts have failed 11762 1726853260.33567: getting the remaining hosts for this loop 11762 1726853260.33568: done getting the remaining hosts for this loop 11762 1726853260.33572: getting the next task for host managed_node2 11762 1726853260.33577: done getting next task for host managed_node2 11762 1726853260.33579: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11762 1726853260.33583: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853260.33592: getting variables 11762 1726853260.33593: in VariableManager get_vars() 11762 1726853260.33606: Calling all_inventory to load vars for managed_node2 11762 1726853260.33608: Calling groups_inventory to load vars for managed_node2 11762 1726853260.33610: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.33616: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.33618: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.33621: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.33955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.34148: done with get_vars() 11762 1726853260.34158: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:27:40 -0400 (0:00:00.072) 0:00:10.772 ****** 11762 1726853260.34235: entering _queue_task() for managed_node2/include_tasks 11762 1726853260.34554: worker is 1 (out of 1 available) 11762 1726853260.34567: exiting _queue_task() for managed_node2/include_tasks 11762 1726853260.34781: done queuing things up, now waiting for results queue to drain 11762 1726853260.34783: 
waiting for pending results... 11762 1726853260.34854: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11762 1726853260.35000: in run() - task 02083763-bbaf-d845-03d0-000000000277 11762 1726853260.35026: variable 'ansible_search_path' from source: unknown 11762 1726853260.35034: variable 'ansible_search_path' from source: unknown 11762 1726853260.35079: calling self._execute() 11762 1726853260.35225: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.35229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.35233: variable 'omit' from source: magic vars 11762 1726853260.35579: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.35596: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.35607: _execute() done 11762 1726853260.35615: dumping result to json 11762 1726853260.35623: done dumping result, returning 11762 1726853260.35634: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-d845-03d0-000000000277] 11762 1726853260.35648: sending task result for task 02083763-bbaf-d845-03d0-000000000277 11762 1726853260.35912: no more pending results, returning what we have 11762 1726853260.35917: in VariableManager get_vars() 11762 1726853260.35965: Calling all_inventory to load vars for managed_node2 11762 1726853260.35968: Calling groups_inventory to load vars for managed_node2 11762 1726853260.35973: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.35986: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.35989: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.35992: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.36321: done sending task result for task 
02083763-bbaf-d845-03d0-000000000277 11762 1726853260.36324: WORKER PROCESS EXITING 11762 1726853260.36350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.36593: done with get_vars() 11762 1726853260.36601: variable 'ansible_search_path' from source: unknown 11762 1726853260.36602: variable 'ansible_search_path' from source: unknown 11762 1726853260.36640: we have included files to process 11762 1726853260.36641: generating all_blocks data 11762 1726853260.36643: done generating all_blocks data 11762 1726853260.36650: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853260.36651: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853260.36654: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853260.37336: done processing included file 11762 1726853260.37338: iterating over new_blocks loaded from include file 11762 1726853260.37339: in VariableManager get_vars() 11762 1726853260.37366: done with get_vars() 11762 1726853260.37368: filtering new block on tags 11762 1726853260.37400: done filtering new block on tags 11762 1726853260.37403: in VariableManager get_vars() 11762 1726853260.37427: done with get_vars() 11762 1726853260.37429: filtering new block on tags 11762 1726853260.37476: done filtering new block on tags 11762 1726853260.37479: in VariableManager get_vars() 11762 1726853260.37500: done with get_vars() 11762 1726853260.37502: filtering new block on tags 11762 1726853260.37543: done filtering new block on tags 11762 1726853260.37548: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11762 
1726853260.37557: extending task lists for all hosts with included blocks 11762 1726853260.38518: done extending task lists 11762 1726853260.38519: done processing included files 11762 1726853260.38519: results queue empty 11762 1726853260.38520: checking for any_errors_fatal 11762 1726853260.38522: done checking for any_errors_fatal 11762 1726853260.38523: checking for max_fail_percentage 11762 1726853260.38524: done checking for max_fail_percentage 11762 1726853260.38524: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.38525: done checking to see if all hosts have failed 11762 1726853260.38525: getting the remaining hosts for this loop 11762 1726853260.38526: done getting the remaining hosts for this loop 11762 1726853260.38528: getting the next task for host managed_node2 11762 1726853260.38532: done getting next task for host managed_node2 11762 1726853260.38534: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11762 1726853260.38537: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853260.38546: getting variables 11762 1726853260.38547: in VariableManager get_vars() 11762 1726853260.38557: Calling all_inventory to load vars for managed_node2 11762 1726853260.38558: Calling groups_inventory to load vars for managed_node2 11762 1726853260.38559: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.38564: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.38565: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.38567: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.38656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.38778: done with get_vars() 11762 1726853260.38785: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:27:40 -0400 (0:00:00.045) 0:00:10.818 ****** 11762 1726853260.38834: entering _queue_task() for managed_node2/setup 11762 1726853260.39075: worker is 1 (out of 1 available) 11762 1726853260.39089: exiting _queue_task() for managed_node2/setup 11762 1726853260.39101: done queuing things up, now waiting for results queue to drain 11762 1726853260.39103: waiting for pending results... 
11762 1726853260.39266: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11762 1726853260.39384: in run() - task 02083763-bbaf-d845-03d0-0000000002d4 11762 1726853260.39402: variable 'ansible_search_path' from source: unknown 11762 1726853260.39410: variable 'ansible_search_path' from source: unknown 11762 1726853260.39674: calling self._execute() 11762 1726853260.39677: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.39680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.39683: variable 'omit' from source: magic vars 11762 1726853260.39954: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.39974: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.40252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853260.41860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853260.41915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853260.41945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853260.41973: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853260.41993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853260.42058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853260.42079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853260.42097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853260.42122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853260.42133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853260.42178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853260.42195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853260.42211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853260.42236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853260.42249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853260.42361: variable '__network_required_facts' from source: role 
'' defaults 11762 1726853260.42374: variable 'ansible_facts' from source: unknown 11762 1726853260.42429: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11762 1726853260.42433: when evaluation is False, skipping this task 11762 1726853260.42436: _execute() done 11762 1726853260.42438: dumping result to json 11762 1726853260.42441: done dumping result, returning 11762 1726853260.42448: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-d845-03d0-0000000002d4] 11762 1726853260.42453: sending task result for task 02083763-bbaf-d845-03d0-0000000002d4 11762 1726853260.42550: done sending task result for task 02083763-bbaf-d845-03d0-0000000002d4 11762 1726853260.42553: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853260.42633: no more pending results, returning what we have 11762 1726853260.42637: results queue empty 11762 1726853260.42637: checking for any_errors_fatal 11762 1726853260.42639: done checking for any_errors_fatal 11762 1726853260.42640: checking for max_fail_percentage 11762 1726853260.42642: done checking for max_fail_percentage 11762 1726853260.42642: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.42643: done checking to see if all hosts have failed 11762 1726853260.42644: getting the remaining hosts for this loop 11762 1726853260.42645: done getting the remaining hosts for this loop 11762 1726853260.42649: getting the next task for host managed_node2 11762 1726853260.42659: done getting next task for host managed_node2 11762 1726853260.42662: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11762 1726853260.42668: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853260.42684: getting variables 11762 1726853260.42686: in VariableManager get_vars() 11762 1726853260.42725: Calling all_inventory to load vars for managed_node2 11762 1726853260.42728: Calling groups_inventory to load vars for managed_node2 11762 1726853260.42730: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.42738: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.42740: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.42749: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.43095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.43321: done with get_vars() 11762 1726853260.43341: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:27:40 -0400 (0:00:00.046) 0:00:10.864 ****** 11762 1726853260.43449: entering _queue_task() for managed_node2/stat 11762 1726853260.43747: worker is 1 (out of 1 available) 11762 1726853260.43762: exiting _queue_task() for managed_node2/stat 11762 1726853260.43889: done queuing things up, now waiting for results queue to drain 11762 1726853260.43892: waiting for pending results... 
11762 1726853260.44203: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11762 1726853260.44260: in run() - task 02083763-bbaf-d845-03d0-0000000002d6 11762 1726853260.44289: variable 'ansible_search_path' from source: unknown 11762 1726853260.44315: variable 'ansible_search_path' from source: unknown 11762 1726853260.44346: calling self._execute() 11762 1726853260.44506: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.44510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.44513: variable 'omit' from source: magic vars 11762 1726853260.44892: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.44910: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.45095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853260.45391: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853260.45445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853260.45493: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853260.45535: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853260.45665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853260.45706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853260.45815: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853260.45818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853260.45874: variable '__network_is_ostree' from source: set_fact 11762 1726853260.45887: Evaluated conditional (not __network_is_ostree is defined): False 11762 1726853260.45895: when evaluation is False, skipping this task 11762 1726853260.45902: _execute() done 11762 1726853260.45909: dumping result to json 11762 1726853260.45920: done dumping result, returning 11762 1726853260.45935: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-d845-03d0-0000000002d6] 11762 1726853260.46031: sending task result for task 02083763-bbaf-d845-03d0-0000000002d6 11762 1726853260.46109: done sending task result for task 02083763-bbaf-d845-03d0-0000000002d6 11762 1726853260.46112: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11762 1726853260.46188: no more pending results, returning what we have 11762 1726853260.46193: results queue empty 11762 1726853260.46194: checking for any_errors_fatal 11762 1726853260.46199: done checking for any_errors_fatal 11762 1726853260.46200: checking for max_fail_percentage 11762 1726853260.46202: done checking for max_fail_percentage 11762 1726853260.46203: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.46204: done checking to see if all hosts have failed 11762 1726853260.46205: getting the remaining hosts for this loop 11762 1726853260.46207: done getting the remaining hosts for this loop 11762 
1726853260.46211: getting the next task for host managed_node2 11762 1726853260.46218: done getting next task for host managed_node2 11762 1726853260.46276: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11762 1726853260.46283: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853260.46297: getting variables 11762 1726853260.46298: in VariableManager get_vars() 11762 1726853260.46394: Calling all_inventory to load vars for managed_node2 11762 1726853260.46397: Calling groups_inventory to load vars for managed_node2 11762 1726853260.46400: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.46410: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.46412: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.46415: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.46734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.46972: done with get_vars() 11762 1726853260.46985: done getting variables 11762 1726853260.47053: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:27:40 -0400 (0:00:00.036) 0:00:10.901 ****** 11762 1726853260.47095: entering _queue_task() for managed_node2/set_fact 11762 1726853260.47400: worker is 1 (out of 1 available) 11762 1726853260.47412: exiting _queue_task() for managed_node2/set_fact 11762 1726853260.47426: done queuing things up, now waiting for results queue to drain 11762 1726853260.47427: waiting for pending results... 
11762 1726853260.47804: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11762 1726853260.47879: in run() - task 02083763-bbaf-d845-03d0-0000000002d7 11762 1726853260.47884: variable 'ansible_search_path' from source: unknown 11762 1726853260.48008: variable 'ansible_search_path' from source: unknown 11762 1726853260.48013: calling self._execute() 11762 1726853260.48028: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.48038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.48051: variable 'omit' from source: magic vars 11762 1726853260.48516: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.48533: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.48719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853260.49006: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853260.49054: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853260.49102: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853260.49139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853260.49235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853260.49266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853260.49299: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853260.49340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853260.49440: variable '__network_is_ostree' from source: set_fact 11762 1726853260.49452: Evaluated conditional (not __network_is_ostree is defined): False 11762 1726853260.49460: when evaluation is False, skipping this task 11762 1726853260.49467: _execute() done 11762 1726853260.49535: dumping result to json 11762 1726853260.49539: done dumping result, returning 11762 1726853260.49542: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-d845-03d0-0000000002d7] 11762 1726853260.49544: sending task result for task 02083763-bbaf-d845-03d0-0000000002d7 11762 1726853260.49612: done sending task result for task 02083763-bbaf-d845-03d0-0000000002d7 11762 1726853260.49615: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11762 1726853260.49687: no more pending results, returning what we have 11762 1726853260.49691: results queue empty 11762 1726853260.49692: checking for any_errors_fatal 11762 1726853260.49697: done checking for any_errors_fatal 11762 1726853260.49698: checking for max_fail_percentage 11762 1726853260.49699: done checking for max_fail_percentage 11762 1726853260.49701: checking to see if all hosts have failed and the running result is not ok 11762 1726853260.49702: done checking to see if all hosts have failed 11762 1726853260.49702: getting the remaining hosts for this loop 11762 1726853260.49705: done getting the remaining hosts for this loop 
11762 1726853260.49708: getting the next task for host managed_node2 11762 1726853260.49719: done getting next task for host managed_node2 11762 1726853260.49723: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11762 1726853260.49729: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853260.49742: getting variables 11762 1726853260.49750: in VariableManager get_vars() 11762 1726853260.49792: Calling all_inventory to load vars for managed_node2 11762 1726853260.49795: Calling groups_inventory to load vars for managed_node2 11762 1726853260.49798: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853260.49808: Calling all_plugins_play to load vars for managed_node2 11762 1726853260.49811: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853260.49814: Calling groups_plugins_play to load vars for managed_node2 11762 1726853260.50339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853260.50561: done with get_vars() 11762 1726853260.50574: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:27:40 -0400 (0:00:00.035) 0:00:10.937 ****** 11762 1726853260.50681: entering _queue_task() for managed_node2/service_facts 11762 1726853260.50683: Creating lock for service_facts 11762 1726853260.51187: worker is 1 (out of 1 available) 11762 1726853260.51195: exiting _queue_task() for managed_node2/service_facts 11762 1726853260.51205: done queuing things up, now waiting for results queue to drain 11762 1726853260.51207: waiting for pending results... 
11762 1726853260.51338: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11762 1726853260.51418: in run() - task 02083763-bbaf-d845-03d0-0000000002d9 11762 1726853260.51448: variable 'ansible_search_path' from source: unknown 11762 1726853260.51458: variable 'ansible_search_path' from source: unknown 11762 1726853260.51503: calling self._execute() 11762 1726853260.51597: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.51609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.51623: variable 'omit' from source: magic vars 11762 1726853260.52033: variable 'ansible_distribution_major_version' from source: facts 11762 1726853260.52070: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853260.52078: variable 'omit' from source: magic vars 11762 1726853260.52149: variable 'omit' from source: magic vars 11762 1726853260.52195: variable 'omit' from source: magic vars 11762 1726853260.52286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853260.52293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853260.52310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853260.52329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853260.52345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853260.52380: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853260.52394: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.52411: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853260.52516: Set connection var ansible_timeout to 10 11762 1726853260.52615: Set connection var ansible_shell_type to sh 11762 1726853260.52618: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853260.52623: Set connection var ansible_shell_executable to /bin/sh 11762 1726853260.52625: Set connection var ansible_pipelining to False 11762 1726853260.52627: Set connection var ansible_connection to ssh 11762 1726853260.52629: variable 'ansible_shell_executable' from source: unknown 11762 1726853260.52630: variable 'ansible_connection' from source: unknown 11762 1726853260.52633: variable 'ansible_module_compression' from source: unknown 11762 1726853260.52634: variable 'ansible_shell_type' from source: unknown 11762 1726853260.52636: variable 'ansible_shell_executable' from source: unknown 11762 1726853260.52638: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853260.52640: variable 'ansible_pipelining' from source: unknown 11762 1726853260.52641: variable 'ansible_timeout' from source: unknown 11762 1726853260.52643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853260.52867: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853260.52874: variable 'omit' from source: magic vars 11762 1726853260.52877: starting attempt loop 11762 1726853260.52879: running the handler 11762 1726853260.52942: _low_level_execute_command(): starting 11762 1726853260.52945: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853260.53723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853260.53796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853260.53835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853260.53880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853260.53957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853260.55732: stdout chunk (state=3): >>>/root <<< 11762 1726853260.56003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853260.56009: stdout chunk (state=3): >>><<< 11762 1726853260.56012: stderr chunk (state=3): >>><<< 11762 1726853260.56136: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853260.56141: _low_level_execute_command(): starting 11762 1726853260.56144: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546 `" && echo ansible-tmp-1726853260.5603876-12273-243946253900546="` echo /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546 `" ) && sleep 0' 11762 1726853260.56728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853260.56743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853260.56757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853260.56787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853260.56828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853260.56853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853260.56936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853260.56955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853260.56973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853260.57082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853260.59108: stdout chunk (state=3): >>>ansible-tmp-1726853260.5603876-12273-243946253900546=/root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546 <<< 11762 1726853260.59285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853260.59300: stdout chunk (state=3): >>><<< 11762 1726853260.59310: stderr chunk (state=3): >>><<< 11762 1726853260.59329: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853260.5603876-12273-243946253900546=/root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853260.59482: variable 'ansible_module_compression' from source: unknown 11762 1726853260.59485: ANSIBALLZ: Using lock for service_facts 11762 1726853260.59487: ANSIBALLZ: Acquiring lock 11762 1726853260.59489: ANSIBALLZ: Lock acquired: 139956163340112 11762 1726853260.59491: ANSIBALLZ: Creating module 11762 1726853260.72407: ANSIBALLZ: Writing module into payload 11762 1726853260.72513: ANSIBALLZ: Writing module 11762 1726853260.72534: ANSIBALLZ: Renaming module 11762 1726853260.72540: ANSIBALLZ: Done creating module 11762 1726853260.72560: variable 'ansible_facts' from source: unknown 11762 1726853260.72642: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/AnsiballZ_service_facts.py 11762 1726853260.72884: Sending initial data 11762 1726853260.72888: Sent initial data (162 bytes) 11762 1726853260.73411: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853260.73434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853260.73464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853260.73498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853260.73510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853260.73517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853260.73643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853260.75308: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853260.75380: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853260.75480: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpq18nvu2e /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/AnsiballZ_service_facts.py <<< 11762 1726853260.75484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/AnsiballZ_service_facts.py" <<< 11762 1726853260.75545: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpq18nvu2e" to remote "/root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/AnsiballZ_service_facts.py" <<< 11762 1726853260.76283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853260.76328: stderr chunk (state=3): >>><<< 11762 1726853260.76331: stdout chunk (state=3): >>><<< 11762 1726853260.76333: done transferring module to remote 11762 1726853260.76337: _low_level_execute_command(): starting 11762 1726853260.76576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/ /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/AnsiballZ_service_facts.py && sleep 0' 11762 1726853260.76952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853260.76963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853260.76979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853260.76995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853260.77007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 <<< 11762 1726853260.77015: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853260.77025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853260.77048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853260.77056: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853260.77058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853260.77150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853260.77153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853260.77307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853260.79183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853260.79212: stderr chunk (state=3): >>><<< 11762 1726853260.79215: stdout chunk (state=3): >>><<< 11762 1726853260.79228: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853260.79230: _low_level_execute_command(): starting 11762 1726853260.79236: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/AnsiballZ_service_facts.py && sleep 0' 11762 1726853260.79665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853260.79697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853260.79700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853260.79702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853260.79705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853260.79707: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853260.79758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853260.79761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853260.79854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853262.48199: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11762 1726853262.49779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853262.49783: stdout chunk (state=3): >>><<< 11762 1726853262.49785: stderr chunk (state=3): >>><<< 11762 1726853262.49898: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": 
{"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": 
"plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": 
{"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": 
"systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853262.50777: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853262.50781: _low_level_execute_command(): starting 11762 1726853262.50783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853260.5603876-12273-243946253900546/ > /dev/null 2>&1 && sleep 0' 11762 1726853262.51188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853262.51217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853262.51228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853262.51252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853262.51264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853262.51303: stderr chunk 
(state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853262.51310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853262.51356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853262.51410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853262.51434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853262.51580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853262.53683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853262.53687: stdout chunk (state=3): >>><<< 11762 1726853262.53690: stderr chunk (state=3): >>><<< 11762 1726853262.53692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853262.53694: handler run complete 11762 1726853262.53794: variable 'ansible_facts' from source: unknown 11762 1726853262.53962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853262.54437: variable 'ansible_facts' from source: unknown 11762 1726853262.55532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853262.55741: attempt loop complete, returning result 11762 1726853262.55745: _execute() done 11762 1726853262.55751: dumping result to json 11762 1726853262.55814: done dumping result, returning 11762 1726853262.55825: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-d845-03d0-0000000002d9] 11762 1726853262.55831: sending task result for task 02083763-bbaf-d845-03d0-0000000002d9 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853262.56739: no more pending results, returning what we have 11762 1726853262.56741: results queue empty 11762 1726853262.56742: checking for any_errors_fatal 11762 1726853262.56746: done checking for any_errors_fatal 11762 1726853262.56747: checking for max_fail_percentage 11762 1726853262.56748: done checking for 
max_fail_percentage 11762 1726853262.56749: checking to see if all hosts have failed and the running result is not ok 11762 1726853262.56750: done checking to see if all hosts have failed 11762 1726853262.56751: getting the remaining hosts for this loop 11762 1726853262.56752: done getting the remaining hosts for this loop 11762 1726853262.56755: getting the next task for host managed_node2 11762 1726853262.56760: done getting next task for host managed_node2 11762 1726853262.56763: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11762 1726853262.56769: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11762 1726853262.56780: getting variables 11762 1726853262.56781: in VariableManager get_vars() 11762 1726853262.56815: Calling all_inventory to load vars for managed_node2 11762 1726853262.56818: Calling groups_inventory to load vars for managed_node2 11762 1726853262.56821: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853262.56829: done sending task result for task 02083763-bbaf-d845-03d0-0000000002d9 11762 1726853262.56832: WORKER PROCESS EXITING 11762 1726853262.56841: Calling all_plugins_play to load vars for managed_node2 11762 1726853262.56849: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853262.56853: Calling groups_plugins_play to load vars for managed_node2 11762 1726853262.57268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853262.57791: done with get_vars() 11762 1726853262.57808: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:27:42 -0400 (0:00:02.072) 0:00:13.009 ****** 11762 1726853262.57914: entering _queue_task() for managed_node2/package_facts 11762 1726853262.57915: Creating lock for package_facts 11762 1726853262.58500: worker is 1 (out of 1 available) 11762 1726853262.58509: exiting _queue_task() for managed_node2/package_facts 11762 1726853262.58520: done queuing things up, now waiting for results queue to drain 11762 1726853262.58522: waiting for pending results... 
11762 1726853262.58757: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11762 1726853262.58798: in run() - task 02083763-bbaf-d845-03d0-0000000002da 11762 1726853262.58818: variable 'ansible_search_path' from source: unknown 11762 1726853262.58825: variable 'ansible_search_path' from source: unknown 11762 1726853262.58868: calling self._execute() 11762 1726853262.58949: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853262.58965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853262.59074: variable 'omit' from source: magic vars 11762 1726853262.59350: variable 'ansible_distribution_major_version' from source: facts 11762 1726853262.59367: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853262.59379: variable 'omit' from source: magic vars 11762 1726853262.59468: variable 'omit' from source: magic vars 11762 1726853262.59511: variable 'omit' from source: magic vars 11762 1726853262.59559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853262.59599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853262.59631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853262.59656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853262.59673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853262.59706: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853262.59718: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853262.59727: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853262.59832: Set connection var ansible_timeout to 10 11762 1726853262.59841: Set connection var ansible_shell_type to sh 11762 1726853262.59855: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853262.59938: Set connection var ansible_shell_executable to /bin/sh 11762 1726853262.59941: Set connection var ansible_pipelining to False 11762 1726853262.59943: Set connection var ansible_connection to ssh 11762 1726853262.59945: variable 'ansible_shell_executable' from source: unknown 11762 1726853262.59948: variable 'ansible_connection' from source: unknown 11762 1726853262.59950: variable 'ansible_module_compression' from source: unknown 11762 1726853262.59952: variable 'ansible_shell_type' from source: unknown 11762 1726853262.59954: variable 'ansible_shell_executable' from source: unknown 11762 1726853262.59956: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853262.59959: variable 'ansible_pipelining' from source: unknown 11762 1726853262.59962: variable 'ansible_timeout' from source: unknown 11762 1726853262.59964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853262.60156: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853262.60185: variable 'omit' from source: magic vars 11762 1726853262.60188: starting attempt loop 11762 1726853262.60190: running the handler 11762 1726853262.60202: _low_level_execute_command(): starting 11762 1726853262.60264: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853262.60951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853262.60985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853262.61032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853262.61096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853262.61128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853262.61142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853262.61259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853262.62982: stdout chunk (state=3): >>>/root <<< 11762 1726853262.63137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853262.63140: stdout chunk (state=3): >>><<< 11762 1726853262.63142: stderr chunk (state=3): >>><<< 11762 1726853262.63162: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853262.63185: _low_level_execute_command(): starting 11762 1726853262.63270: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726 `" && echo ansible-tmp-1726853262.6316996-12333-279676547514726="` echo /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726 `" ) && sleep 0' 11762 1726853262.63834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853262.63849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853262.63864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853262.63882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853262.63898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853262.64006: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853262.64034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853262.64137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853262.66105: stdout chunk (state=3): >>>ansible-tmp-1726853262.6316996-12333-279676547514726=/root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726 <<< 11762 1726853262.66277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853262.66281: stdout chunk (state=3): >>><<< 11762 1726853262.66283: stderr chunk (state=3): >>><<< 11762 1726853262.66300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853262.6316996-12333-279676547514726=/root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853262.66353: variable 'ansible_module_compression' from source: unknown 11762 1726853262.66476: ANSIBALLZ: Using lock for package_facts 11762 1726853262.66481: ANSIBALLZ: Acquiring lock 11762 1726853262.66484: ANSIBALLZ: Lock acquired: 139956161354912 11762 1726853262.66488: ANSIBALLZ: Creating module 11762 1726853263.06628: ANSIBALLZ: Writing module into payload 11762 1726853263.06786: ANSIBALLZ: Writing module 11762 1726853263.06817: ANSIBALLZ: Renaming module 11762 1726853263.06828: ANSIBALLZ: Done creating module 11762 1726853263.06975: variable 'ansible_facts' from source: unknown 11762 1726853263.07092: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/AnsiballZ_package_facts.py 11762 1726853263.07299: Sending initial data 11762 1726853263.07302: Sent initial data (162 bytes) 11762 1726853263.07897: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853263.07978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853263.08023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853263.08039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853263.08062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853263.08274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853263.09953: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853263.10019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853263.10104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp2ig7bqoo /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/AnsiballZ_package_facts.py <<< 11762 1726853263.10108: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/AnsiballZ_package_facts.py" <<< 11762 1726853263.10160: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp2ig7bqoo" to remote "/root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/AnsiballZ_package_facts.py" <<< 11762 1726853263.12135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853263.12139: stdout chunk (state=3): >>><<< 11762 1726853263.12142: stderr chunk (state=3): >>><<< 11762 1726853263.12377: done transferring module to remote 11762 1726853263.12382: _low_level_execute_command(): starting 11762 1726853263.12385: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/ /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/AnsiballZ_package_facts.py && sleep 0' 11762 1726853263.13491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853263.13495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853263.13498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853263.13500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853263.13503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 <<< 11762 1726853263.13505: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853263.13512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853263.13514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853263.13517: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853263.13519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853263.13520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853263.13522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853263.13524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853263.13526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853263.13527: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853263.13529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853263.13663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853263.13674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853263.13678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853263.13711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853263.15631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853263.15635: stdout chunk (state=3): >>><<< 11762 1726853263.15643: stderr chunk (state=3): >>><<< 11762 1726853263.15664: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853263.15667: _low_level_execute_command(): starting 11762 1726853263.15728: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/AnsiballZ_package_facts.py && sleep 0' 11762 1726853263.16778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853263.16790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853263.16796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853263.16819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853263.16831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853263.17092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853263.63558: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": 
[{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": 
"realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11762 1726853263.63779: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", 
"version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": 
"libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 11762 1726853263.63789: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": 
"3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11762 1726853263.63794: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": 
"4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.7<<< 11762 1726853263.63934: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": 
"kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", 
"source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11762 1726853263.63985: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": 
[{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11762 1726853263.64012: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": 
"24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11762 1726853263.65638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853263.65642: stdout chunk (state=3): >>><<< 11762 1726853263.65654: stderr chunk (state=3): >>><<< 11762 1726853263.65701: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853263.68655: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853263.68659: _low_level_execute_command(): starting 11762 1726853263.68662: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853262.6316996-12333-279676547514726/ > /dev/null 2>&1 && sleep 0' 11762 1726853263.69190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853263.69206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853263.69219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853263.69324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853263.69353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853263.69382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853263.69506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853263.71978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853263.71982: stdout chunk (state=3): >>><<< 11762 1726853263.71984: stderr chunk (state=3): >>><<< 11762 1726853263.71987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853263.71989: handler run 
complete 11762 1726853263.73761: variable 'ansible_facts' from source: unknown 11762 1726853263.74820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853263.77001: variable 'ansible_facts' from source: unknown 11762 1726853263.77893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853263.78663: attempt loop complete, returning result 11762 1726853263.78674: _execute() done 11762 1726853263.78677: dumping result to json 11762 1726853263.78802: done dumping result, returning 11762 1726853263.78810: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-d845-03d0-0000000002da] 11762 1726853263.78816: sending task result for task 02083763-bbaf-d845-03d0-0000000002da 11762 1726853263.84309: done sending task result for task 02083763-bbaf-d845-03d0-0000000002da 11762 1726853263.84312: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853263.84414: no more pending results, returning what we have 11762 1726853263.84417: results queue empty 11762 1726853263.84418: checking for any_errors_fatal 11762 1726853263.84422: done checking for any_errors_fatal 11762 1726853263.84422: checking for max_fail_percentage 11762 1726853263.84424: done checking for max_fail_percentage 11762 1726853263.84425: checking to see if all hosts have failed and the running result is not ok 11762 1726853263.84426: done checking to see if all hosts have failed 11762 1726853263.84426: getting the remaining hosts for this loop 11762 1726853263.84427: done getting the remaining hosts for this loop 11762 1726853263.84431: getting the next task for host managed_node2 11762 1726853263.84437: done getting next task for host managed_node2 11762 
1726853263.84440: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853263.84446: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853263.84461: getting variables 11762 1726853263.84463: in VariableManager get_vars() 11762 1726853263.84489: Calling all_inventory to load vars for managed_node2 11762 1726853263.84491: Calling groups_inventory to load vars for managed_node2 11762 1726853263.84493: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853263.84502: Calling all_plugins_play to load vars for managed_node2 11762 1726853263.84504: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853263.84507: Calling groups_plugins_play to load vars for managed_node2 11762 1726853263.85364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853263.86218: done with get_vars() 11762 1726853263.86233: done getting variables 11762 1726853263.86277: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:27:43 -0400 (0:00:01.283) 0:00:14.293 ****** 11762 1726853263.86308: entering _queue_task() for managed_node2/debug 11762 1726853263.86532: worker is 1 (out of 1 available) 11762 1726853263.86545: exiting _queue_task() for managed_node2/debug 11762 1726853263.86557: done queuing things up, now waiting for results queue to drain 11762 1726853263.86559: waiting for pending results... 
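The `get_vars()` trace above shows the order in which `VariableManager` layers variable sources for `managed_node2`: `all_inventory` first, then group and play sources, with `groups_plugins_play` called last. A minimal sketch of that layered merge, where later layers win; the variable names and values in the example layers are hypothetical stand-ins (real Ansible precedence has many more layers and per-group ordering rules):

```python
def merge_host_vars(layers):
    """Merge variable layers from lowest to highest precedence.

    Each layer is a plain dict; a later layer overrides any key
    already set by an earlier one, mimicking the order in which
    the log calls the *_inventory / *_play var sources.
    """
    merged = {}
    for layer in layers:
        merged.update(layer)
    return merged


# Hypothetical layers, in the order the log loads them:
layers = [
    {"ansible_host": "10.31.9.197"},      # all_inventory (lowest precedence)
    {"network_provider": "initscripts"},  # groups_inventory
    {"network_provider": "nm"},           # groups_plugins_play (highest wins)
]
merged = merge_host_vars(layers)
```

The later `groups_plugins_play` layer overrides the earlier value, which is why the task below ends up reporting the `nm` provider rather than anything set lower in the chain.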
11762 1726853263.86990: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853263.86995: in run() - task 02083763-bbaf-d845-03d0-000000000278 11762 1726853263.86998: variable 'ansible_search_path' from source: unknown 11762 1726853263.87000: variable 'ansible_search_path' from source: unknown 11762 1726853263.87007: calling self._execute() 11762 1726853263.87098: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853263.87115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853263.87132: variable 'omit' from source: magic vars 11762 1726853263.87477: variable 'ansible_distribution_major_version' from source: facts 11762 1726853263.87502: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853263.87505: variable 'omit' from source: magic vars 11762 1726853263.87540: variable 'omit' from source: magic vars 11762 1726853263.87876: variable 'network_provider' from source: set_fact 11762 1726853263.87879: variable 'omit' from source: magic vars 11762 1726853263.87883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853263.87885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853263.87888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853263.87890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853263.87892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853263.87894: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853263.87897: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 
1726853263.87899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853263.87922: Set connection var ansible_timeout to 10 11762 1726853263.87931: Set connection var ansible_shell_type to sh 11762 1726853263.87942: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853263.87952: Set connection var ansible_shell_executable to /bin/sh 11762 1726853263.87964: Set connection var ansible_pipelining to False 11762 1726853263.87978: Set connection var ansible_connection to ssh 11762 1726853263.88005: variable 'ansible_shell_executable' from source: unknown 11762 1726853263.88013: variable 'ansible_connection' from source: unknown 11762 1726853263.88020: variable 'ansible_module_compression' from source: unknown 11762 1726853263.88026: variable 'ansible_shell_type' from source: unknown 11762 1726853263.88032: variable 'ansible_shell_executable' from source: unknown 11762 1726853263.88039: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853263.88045: variable 'ansible_pipelining' from source: unknown 11762 1726853263.88051: variable 'ansible_timeout' from source: unknown 11762 1726853263.88058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853263.88192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853263.88209: variable 'omit' from source: magic vars 11762 1726853263.88218: starting attempt loop 11762 1726853263.88226: running the handler 11762 1726853263.88273: handler run complete 11762 1726853263.88292: attempt loop complete, returning result 11762 1726853263.88299: _execute() done 11762 1726853263.88306: dumping result to json 11762 1726853263.88313: done dumping result, returning 
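Each task execution traced here first evaluates its `when` conditionals (e.g. `Evaluated conditional (ansible_distribution_major_version != '6'): True`) and only then runs the handler; a failing conditional short-circuits into a skip result carrying `false_condition` and `skip_reason`, as several later tasks in this log show. A rough Python sketch of that skip logic, assuming conditions are plain callables paired with their source expression rather than Ansible's real Jinja2 templating:

```python
def execute_task(conditions, task_vars, handler):
    """Evaluate every `when` condition; skip the task on the first False one."""
    for expr, cond in conditions:
        if not cond(task_vars):
            # Shape of the skip result mirrors what the log prints for
            # skipped tasks: changed=false, false_condition, skip_reason.
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return handler(task_vars)


# Illustrative task_vars: a non-EL6 host with an empty network_state,
# matching the conditionals evaluated in this log.
task_vars = {"ansible_distribution_major_version": "9", "network_state": {}}
conditions = [
    ("ansible_distribution_major_version != '6'",
     lambda v: v["ansible_distribution_major_version"] != "6"),
    ("network_state != {}",
     lambda v: v["network_state"] != {}),
]
result = execute_task(conditions, task_vars,
                      lambda v: {"changed": False, "msg": "handler ran"})
```

The first condition passes and the second fails, so the handler never runs and the returned dict matches the `skipping: [managed_node2]` output seen further down in the trace.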
11762 1726853263.88324: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-d845-03d0-000000000278] 11762 1726853263.88333: sending task result for task 02083763-bbaf-d845-03d0-000000000278 11762 1726853263.88454: done sending task result for task 02083763-bbaf-d845-03d0-000000000278 11762 1726853263.88461: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 11762 1726853263.88537: no more pending results, returning what we have 11762 1726853263.88541: results queue empty 11762 1726853263.88541: checking for any_errors_fatal 11762 1726853263.88552: done checking for any_errors_fatal 11762 1726853263.88553: checking for max_fail_percentage 11762 1726853263.88555: done checking for max_fail_percentage 11762 1726853263.88556: checking to see if all hosts have failed and the running result is not ok 11762 1726853263.88557: done checking to see if all hosts have failed 11762 1726853263.88557: getting the remaining hosts for this loop 11762 1726853263.88559: done getting the remaining hosts for this loop 11762 1726853263.88562: getting the next task for host managed_node2 11762 1726853263.88569: done getting next task for host managed_node2 11762 1726853263.88574: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853263.88580: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853263.88589: getting variables 11762 1726853263.88591: in VariableManager get_vars() 11762 1726853263.88624: Calling all_inventory to load vars for managed_node2 11762 1726853263.88626: Calling groups_inventory to load vars for managed_node2 11762 1726853263.88628: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853263.88636: Calling all_plugins_play to load vars for managed_node2 11762 1726853263.88639: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853263.88641: Calling groups_plugins_play to load vars for managed_node2 11762 1726853263.89562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853263.90412: done with get_vars() 11762 1726853263.90434: done getting variables 11762 1726853263.90505: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:27:43 -0400 (0:00:00.042) 0:00:14.335 ****** 11762 1726853263.90536: entering _queue_task() for managed_node2/fail 11762 1726853263.90537: Creating lock for fail 11762 1726853263.90808: worker is 1 (out of 1 available) 11762 1726853263.90820: exiting _queue_task() for managed_node2/fail 11762 1726853263.90832: done queuing things up, now waiting for results queue to drain 11762 1726853263.90834: waiting for pending results... 11762 1726853263.91118: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853263.91264: in run() - task 02083763-bbaf-d845-03d0-000000000279 11762 1726853263.91287: variable 'ansible_search_path' from source: unknown 11762 1726853263.91299: variable 'ansible_search_path' from source: unknown 11762 1726853263.91338: calling self._execute() 11762 1726853263.91476: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853263.91480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853263.91483: variable 'omit' from source: magic vars 11762 1726853263.91869: variable 'ansible_distribution_major_version' from source: facts 11762 1726853263.91875: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853263.91942: variable 'network_state' from source: role '' defaults 11762 1726853263.91951: Evaluated conditional (network_state != {}): False 11762 1726853263.91955: when evaluation is False, skipping this task 11762 1726853263.91958: _execute() done 11762 1726853263.91961: dumping result to json 11762 1726853263.91964: done dumping result, returning 11762 1726853263.91970: done running TaskExecutor() 
for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-d845-03d0-000000000279] 11762 1726853263.91977: sending task result for task 02083763-bbaf-d845-03d0-000000000279 11762 1726853263.92067: done sending task result for task 02083763-bbaf-d845-03d0-000000000279 11762 1726853263.92070: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853263.92143: no more pending results, returning what we have 11762 1726853263.92150: results queue empty 11762 1726853263.92150: checking for any_errors_fatal 11762 1726853263.92157: done checking for any_errors_fatal 11762 1726853263.92157: checking for max_fail_percentage 11762 1726853263.92159: done checking for max_fail_percentage 11762 1726853263.92160: checking to see if all hosts have failed and the running result is not ok 11762 1726853263.92160: done checking to see if all hosts have failed 11762 1726853263.92161: getting the remaining hosts for this loop 11762 1726853263.92163: done getting the remaining hosts for this loop 11762 1726853263.92166: getting the next task for host managed_node2 11762 1726853263.92175: done getting next task for host managed_node2 11762 1726853263.92178: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853263.92182: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853263.92197: getting variables 11762 1726853263.92198: in VariableManager get_vars() 11762 1726853263.92228: Calling all_inventory to load vars for managed_node2 11762 1726853263.92231: Calling groups_inventory to load vars for managed_node2 11762 1726853263.92233: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853263.92241: Calling all_plugins_play to load vars for managed_node2 11762 1726853263.92243: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853263.92248: Calling groups_plugins_play to load vars for managed_node2 11762 1726853263.92986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853263.93841: done with get_vars() 11762 1726853263.93858: done getting variables 11762 1726853263.93905: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:27:43 -0400 (0:00:00.033) 0:00:14.369 ****** 11762 1726853263.93930: entering _queue_task() for managed_node2/fail 11762 1726853263.94157: worker is 1 (out of 1 available) 11762 1726853263.94173: exiting _queue_task() for managed_node2/fail 11762 1726853263.94185: done queuing things up, now waiting for results queue to drain 11762 1726853263.94186: waiting for pending results... 11762 1726853263.94352: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853263.94446: in run() - task 02083763-bbaf-d845-03d0-00000000027a 11762 1726853263.94460: variable 'ansible_search_path' from source: unknown 11762 1726853263.94464: variable 'ansible_search_path' from source: unknown 11762 1726853263.94492: calling self._execute() 11762 1726853263.94557: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853263.94560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853263.94569: variable 'omit' from source: magic vars 11762 1726853263.94833: variable 'ansible_distribution_major_version' from source: facts 11762 1726853263.94843: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853263.94924: variable 'network_state' from source: role '' defaults 11762 1726853263.94935: Evaluated conditional (network_state != {}): False 11762 1726853263.94939: when evaluation is False, skipping this task 11762 1726853263.94942: _execute() done 11762 1726853263.94944: dumping result to json 11762 1726853263.94946: done dumping result, returning 11762 1726853263.94956: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-d845-03d0-00000000027a] 11762 1726853263.94958: sending task result for task 02083763-bbaf-d845-03d0-00000000027a 11762 1726853263.95046: done sending task result for task 02083763-bbaf-d845-03d0-00000000027a 11762 1726853263.95048: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853263.95119: no more pending results, returning what we have 11762 1726853263.95122: results queue empty 11762 1726853263.95123: checking for any_errors_fatal 11762 1726853263.95131: done checking for any_errors_fatal 11762 1726853263.95131: checking for max_fail_percentage 11762 1726853263.95133: done checking for max_fail_percentage 11762 1726853263.95134: checking to see if all hosts have failed and the running result is not ok 11762 1726853263.95134: done checking to see if all hosts have failed 11762 1726853263.95135: getting the remaining hosts for this loop 11762 1726853263.95137: done getting the remaining hosts for this loop 11762 1726853263.95140: getting the next task for host managed_node2 11762 1726853263.95147: done getting next task for host managed_node2 11762 1726853263.95150: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853263.95154: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853263.95168: getting variables 11762 1726853263.95169: in VariableManager get_vars() 11762 1726853263.95199: Calling all_inventory to load vars for managed_node2 11762 1726853263.95201: Calling groups_inventory to load vars for managed_node2 11762 1726853263.95203: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853263.95211: Calling all_plugins_play to load vars for managed_node2 11762 1726853263.95213: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853263.95216: Calling groups_plugins_play to load vars for managed_node2 11762 1726853263.96002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853263.96848: done with get_vars() 11762 1726853263.96862: done getting variables 11762 1726853263.96905: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:27:43 -0400 (0:00:00.029) 0:00:14.399 ****** 11762 1726853263.96927: entering _queue_task() for managed_node2/fail 11762 1726853263.97133: worker is 1 (out of 1 available) 11762 1726853263.97145: exiting _queue_task() for managed_node2/fail 11762 1726853263.97157: done queuing things up, now waiting for results queue to drain 11762 1726853263.97159: waiting for pending results... 11762 1726853263.97326: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853263.97409: in run() - task 02083763-bbaf-d845-03d0-00000000027b 11762 1726853263.97421: variable 'ansible_search_path' from source: unknown 11762 1726853263.97425: variable 'ansible_search_path' from source: unknown 11762 1726853263.97454: calling self._execute() 11762 1726853263.97518: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853263.97522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853263.97530: variable 'omit' from source: magic vars 11762 1726853263.97791: variable 'ansible_distribution_major_version' from source: facts 11762 1726853263.97801: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853263.97919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853263.99412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853263.99466: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853263.99494: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853263.99519: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853263.99538: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853263.99601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853263.99620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853263.99638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853263.99668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853263.99680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853263.99746: variable 'ansible_distribution_major_version' from source: facts 11762 1726853263.99762: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11762 1726853263.99841: variable 'ansible_distribution' from source: facts 11762 1726853263.99844: variable '__network_rh_distros' from source: role '' defaults 11762 1726853263.99854: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11762 1726853264.00013: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.00030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.00048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.00075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.00086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.00119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.00135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.00154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.00179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 
1726853264.00189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.00223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.00237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.00256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.00281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.00291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.00485: variable 'network_connections' from source: include params 11762 1726853264.00494: variable 'controller_profile' from source: play vars 11762 1726853264.00537: variable 'controller_profile' from source: play vars 11762 1726853264.00550: variable 'controller_device' from source: play vars 11762 1726853264.00593: variable 'controller_device' from source: play vars 11762 1726853264.00604: variable 'port1_profile' from source: play vars 11762 1726853264.00645: variable 'port1_profile' from source: play vars 11762 1726853264.00653: variable 'dhcp_interface1' from source: play vars 11762 1726853264.00703: variable 
'dhcp_interface1' from source: play vars 11762 1726853264.00707: variable 'controller_profile' from source: play vars 11762 1726853264.00750: variable 'controller_profile' from source: play vars 11762 1726853264.00753: variable 'port2_profile' from source: play vars 11762 1726853264.00800: variable 'port2_profile' from source: play vars 11762 1726853264.00807: variable 'dhcp_interface2' from source: play vars 11762 1726853264.00849: variable 'dhcp_interface2' from source: play vars 11762 1726853264.00853: variable 'controller_profile' from source: play vars 11762 1726853264.00900: variable 'controller_profile' from source: play vars 11762 1726853264.00907: variable 'network_state' from source: role '' defaults 11762 1726853264.00951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853264.01061: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853264.01095: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853264.01115: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853264.01138: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853264.01181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853264.01203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853264.01218: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11762 1726853264.01235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853264.01265: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11762 1726853264.01268: when evaluation is False, skipping this task 11762 1726853264.01272: _execute() done 11762 1726853264.01275: dumping result to json 11762 1726853264.01276: done dumping result, returning 11762 1726853264.01282: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-d845-03d0-00000000027b] 11762 1726853264.01287: sending task result for task 02083763-bbaf-d845-03d0-00000000027b 11762 1726853264.01377: done sending task result for task 02083763-bbaf-d845-03d0-00000000027b 11762 1726853264.01379: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
11762 1726853264.01456: no more pending results, returning what we have 11762 1726853264.01460: results queue empty 11762 1726853264.01460: checking for any_errors_fatal 11762 1726853264.01467: done checking for any_errors_fatal 11762 1726853264.01468: checking for max_fail_percentage 11762 1726853264.01470: done checking for max_fail_percentage 11762 1726853264.01475: checking to see if all 
hosts have failed and the running result is not ok 11762 1726853264.01476: done checking to see if all hosts have failed 11762 1726853264.01477: getting the remaining hosts for this loop 11762 1726853264.01479: done getting the remaining hosts for this loop 11762 1726853264.01483: getting the next task for host managed_node2 11762 1726853264.01490: done getting next task for host managed_node2 11762 1726853264.01493: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853264.01498: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853264.01510: getting variables 11762 1726853264.01511: in VariableManager get_vars() 11762 1726853264.01546: Calling all_inventory to load vars for managed_node2 11762 1726853264.01549: Calling groups_inventory to load vars for managed_node2 11762 1726853264.01551: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.01559: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.01561: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.01563: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.02339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.03287: done with get_vars() 11762 1726853264.03302: done getting variables 11762 1726853264.03375: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:27:44 -0400 (0:00:00.064) 0:00:14.464 ******
11762 1726853264.03397: entering _queue_task() for managed_node2/dnf 11762 1726853264.03623: worker is 1 (out of 1 available) 11762 1726853264.03638: exiting _queue_task() for managed_node2/dnf 11762 1726853264.03653: done queuing things up, now waiting for results queue to drain 11762 1726853264.03655: waiting for pending results... 
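The team-detection conditional the trace shows being evaluated to False (`network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | ... `) can be mimicked in plain Python. This is only a sketch of the filter chain's semantics, not Ansible's implementation; the sample profiles below are hypothetical, shaped after the bond controller/port play vars (`controller_profile`, `port1_profile`, `port2_profile`) seen in the trace:

```python
import re

# Hypothetical connection profiles (the real ones come from the play vars);
# a bond controller with two ethernet ports, and no team interfaces.
network_connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
    {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},
]
network_state = {}


def any_team(items):
    # Mirrors: selectattr("type", "defined") | selectattr("type", "match", "^team$")
    #          | list | length > 0
    # selectattr(..., "defined") drops items without a "type" key; the Jinja
    # "match" test anchors at the start of the string, like re.match.
    return len([i for i in items if "type" in i and re.match(r"^team$", i["type"])]) > 0


result = any_team(network_connections) or any_team(
    network_state.get("interfaces", [])
)
print(result)  # False, so the task is skipped, matching the log
```

With no profile of type `team` in either structure, the expression is False and the teaming-abort task is skipped, which is exactly the `skip_reason` recorded above.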
11762 1726853264.03818: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853264.03904: in run() - task 02083763-bbaf-d845-03d0-00000000027c 11762 1726853264.03916: variable 'ansible_search_path' from source: unknown 11762 1726853264.03920: variable 'ansible_search_path' from source: unknown 11762 1726853264.03949: calling self._execute() 11762 1726853264.04012: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.04017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.04026: variable 'omit' from source: magic vars 11762 1726853264.04284: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.04293: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.04425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853264.05886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853264.05935: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853264.05969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853264.05998: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853264.06017: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853264.06080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.06099: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.06116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.06141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.06154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.06235: variable 'ansible_distribution' from source: facts 11762 1726853264.06238: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.06251: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11762 1726853264.06326: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853264.06411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.06427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.06446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.06469: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.06483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.06512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.06528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.06546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.06568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.06582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.06611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.06627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 
1726853264.06646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.06668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.06680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.06778: variable 'network_connections' from source: include params 11762 1726853264.06788: variable 'controller_profile' from source: play vars 11762 1726853264.06834: variable 'controller_profile' from source: play vars 11762 1726853264.06841: variable 'controller_device' from source: play vars 11762 1726853264.06884: variable 'controller_device' from source: play vars 11762 1726853264.06894: variable 'port1_profile' from source: play vars 11762 1726853264.06938: variable 'port1_profile' from source: play vars 11762 1726853264.06946: variable 'dhcp_interface1' from source: play vars 11762 1726853264.06987: variable 'dhcp_interface1' from source: play vars 11762 1726853264.06992: variable 'controller_profile' from source: play vars 11762 1726853264.07038: variable 'controller_profile' from source: play vars 11762 1726853264.07042: variable 'port2_profile' from source: play vars 11762 1726853264.07086: variable 'port2_profile' from source: play vars 11762 1726853264.07093: variable 'dhcp_interface2' from source: play vars 11762 1726853264.07133: variable 'dhcp_interface2' from source: play vars 11762 1726853264.07140: variable 'controller_profile' from source: play vars 11762 1726853264.07283: variable 'controller_profile' from source: play vars 11762 1726853264.07294: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853264.07388: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853264.07411: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853264.07433: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853264.07456: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853264.07490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853264.07518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853264.07536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.07555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853264.07602: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853264.07750: variable 'network_connections' from source: include params 11762 1726853264.07759: variable 'controller_profile' from source: play vars 11762 1726853264.07808: variable 'controller_profile' from source: play vars 11762 1726853264.07815: variable 'controller_device' from source: play vars 11762 1726853264.07859: variable 'controller_device' from source: play vars 11762 1726853264.07869: variable 
'port1_profile' from source: play vars 11762 1726853264.07912: variable 'port1_profile' from source: play vars 11762 1726853264.07918: variable 'dhcp_interface1' from source: play vars 11762 1726853264.07961: variable 'dhcp_interface1' from source: play vars 11762 1726853264.07966: variable 'controller_profile' from source: play vars 11762 1726853264.08008: variable 'controller_profile' from source: play vars 11762 1726853264.08013: variable 'port2_profile' from source: play vars 11762 1726853264.08057: variable 'port2_profile' from source: play vars 11762 1726853264.08063: variable 'dhcp_interface2' from source: play vars 11762 1726853264.08105: variable 'dhcp_interface2' from source: play vars 11762 1726853264.08111: variable 'controller_profile' from source: play vars 11762 1726853264.08153: variable 'controller_profile' from source: play vars 11762 1726853264.08182: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853264.08185: when evaluation is False, skipping this task 11762 1726853264.08188: _execute() done 11762 1726853264.08190: dumping result to json 11762 1726853264.08193: done dumping result, returning 11762 1726853264.08200: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-00000000027c] 11762 1726853264.08204: sending task result for task 02083763-bbaf-d845-03d0-00000000027c 11762 1726853264.08302: done sending task result for task 02083763-bbaf-d845-03d0-00000000027c 11762 1726853264.08304: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
11762 1726853264.08351: no more pending results, returning what we have 11762 1726853264.08355: 
results queue empty 11762 1726853264.08356: checking for any_errors_fatal 11762 1726853264.08362: done checking for any_errors_fatal 11762 1726853264.08363: checking for max_fail_percentage 11762 1726853264.08365: done checking for max_fail_percentage 11762 1726853264.08365: checking to see if all hosts have failed and the running result is not ok 11762 1726853264.08366: done checking to see if all hosts have failed 11762 1726853264.08366: getting the remaining hosts for this loop 11762 1726853264.08368: done getting the remaining hosts for this loop 11762 1726853264.08373: getting the next task for host managed_node2 11762 1726853264.08381: done getting next task for host managed_node2 11762 1726853264.08385: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853264.08390: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11762 1726853264.08404: getting variables 11762 1726853264.08405: in VariableManager get_vars() 11762 1726853264.08439: Calling all_inventory to load vars for managed_node2 11762 1726853264.08442: Calling groups_inventory to load vars for managed_node2 11762 1726853264.08444: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.08452: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.08455: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.08457: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.09380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.11190: done with get_vars() 11762 1726853264.11211: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
11762 1726853264.11273: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:27:44 -0400 (0:00:00.078) 0:00:14.543 ******
11762 1726853264.11297: entering _queue_task() for managed_node2/yum 11762 1726853264.11299: Creating lock for yum 11762 1726853264.11543: worker is 1 (out of 1 available) 11762 1726853264.11556: exiting _queue_task() for managed_node2/yum 11762 1726853264.11569: done queuing things up, now waiting for results 
queue to drain 11762 1726853264.11573: waiting for pending results... 11762 1726853264.11753: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853264.11842: in run() - task 02083763-bbaf-d845-03d0-00000000027d 11762 1726853264.11857: variable 'ansible_search_path' from source: unknown 11762 1726853264.11861: variable 'ansible_search_path' from source: unknown 11762 1726853264.11890: calling self._execute() 11762 1726853264.11957: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.11960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.11968: variable 'omit' from source: magic vars 11762 1726853264.12240: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.12252: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.12375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853264.15904: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853264.15987: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853264.16227: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853264.16231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853264.16233: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853264.16553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853264.16556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.16558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.16560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.16770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.16775: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.16887: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11762 1726853264.16895: when evaluation is False, skipping this task 11762 1726853264.16902: _execute() done 11762 1726853264.16908: dumping result to json 11762 1726853264.16915: done dumping result, returning 11762 1726853264.16925: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-00000000027d] 11762 1726853264.16934: sending task result for task 02083763-bbaf-d845-03d0-00000000027d
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
11762 1726853264.17088: no more pending results, returning what we have 11762 1726853264.17092: results queue empty 11762 
1726853264.17093: checking for any_errors_fatal 11762 1726853264.17100: done checking for any_errors_fatal 11762 1726853264.17101: checking for max_fail_percentage 11762 1726853264.17103: done checking for max_fail_percentage 11762 1726853264.17104: checking to see if all hosts have failed and the running result is not ok 11762 1726853264.17104: done checking to see if all hosts have failed 11762 1726853264.17105: getting the remaining hosts for this loop 11762 1726853264.17107: done getting the remaining hosts for this loop 11762 1726853264.17110: getting the next task for host managed_node2 11762 1726853264.17120: done getting next task for host managed_node2 11762 1726853264.17124: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853264.17130: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853264.17146: getting variables 11762 1726853264.17147: in VariableManager get_vars() 11762 1726853264.17190: Calling all_inventory to load vars for managed_node2 11762 1726853264.17193: Calling groups_inventory to load vars for managed_node2 11762 1726853264.17195: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.17207: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.17210: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.17213: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.18478: done sending task result for task 02083763-bbaf-d845-03d0-00000000027d 11762 1726853264.18481: WORKER PROCESS EXITING 11762 1726853264.20721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.24157: done with get_vars() 11762 1726853264.24302: done getting variables 11762 1726853264.24370: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:27:44 -0400 (0:00:00.131) 0:00:14.674 ******
11762 1726853264.24479: entering _queue_task() for managed_node2/fail 11762 1726853264.24929: worker is 1 (out of 1 available) 11762 1726853264.24942: exiting _queue_task() for managed_node2/fail 11762 1726853264.24955: done queuing things up, now waiting for results queue to drain 11762 1726853264.24957: waiting for pending results... 
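The two version gates this trace keeps evaluating (`ansible_distribution_major_version != '6'` → True, and `ansible_distribution_major_version | int < 8` → False for the YUM task) together imply the managed host reports EL8 or later. A minimal sketch of that logic in plain Python, assuming a hypothetical fact value of "9" (the actual gathered facts are not shown in this excerpt):

```python
# Ansible facts arrive as strings, so the role's conditionals compare the raw
# string in one gate and pipe through Jinja's `int` filter in the other.
# "9" is a hypothetical value consistent with this log; note that Jinja's
# `int` filter defaults to 0 on unparseable input, whereas Python's int()
# raises ValueError.
ansible_distribution_major_version = "9"

gate_not_el6 = ansible_distribution_major_version != '6'          # string compare
gate_pre_el8 = int(ansible_distribution_major_version) < 8        # `| int < 8`

print(gate_not_el6, gate_pre_el8)  # True False
```

With those values the first gate lets the task run and the second skips the YUM path, matching the `skip_reason` recorded for task 02083763-bbaf-d845-03d0-00000000027d.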
11762 1726853264.25563: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853264.25859: in run() - task 02083763-bbaf-d845-03d0-00000000027e 11762 1726853264.25935: variable 'ansible_search_path' from source: unknown 11762 1726853264.25948: variable 'ansible_search_path' from source: unknown 11762 1726853264.25994: calling self._execute() 11762 1726853264.26224: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.26288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.26303: variable 'omit' from source: magic vars 11762 1726853264.27191: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.27207: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.27461: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853264.28035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853264.32329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853264.32416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853264.32454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853264.32497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853264.32527: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853264.32613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853264.32646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.32677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.32728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.32823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.32826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.32829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.32855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.32898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.32915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.32965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.32993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.33019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.33069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.33088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.33277: variable 'network_connections' from source: include params 11762 1726853264.33368: variable 'controller_profile' from source: play vars 11762 1726853264.33377: variable 'controller_profile' from source: play vars 11762 1726853264.33394: variable 'controller_device' from source: play vars 11762 1726853264.33465: variable 'controller_device' from source: play vars 11762 1726853264.33500: variable 'port1_profile' from source: play vars 11762 1726853264.33558: variable 'port1_profile' from source: play vars 11762 1726853264.33572: variable 'dhcp_interface1' from source: play vars 11762 1726853264.33645: variable 'dhcp_interface1' from source: play vars 11762 1726853264.33694: variable 'controller_profile' from source: play vars 
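
The resolver above repeatedly expands `network_connections` from play vars (`controller_profile`, `controller_device`, `port1_profile`/`port2_profile`, `dhcp_interface1`/`dhcp_interface2`). A minimal sketch of what those play vars plausibly look like for a bond controller with two ports — the variable names match the log, but every concrete value and the connection layout below are assumptions, not taken from this run:

```yaml
# Hypothetical play vars; names come from the log, values are assumed.
controller_profile: bond0
controller_device: nm-bond
port1_profile: bond0.0
port2_profile: bond0.1
dhcp_interface1: test1
dhcp_interface2: test2

# Sketch of a network_connections list that would reference them:
network_connections:
  - name: "{{ controller_profile }}"
    type: bond
    interface_name: "{{ controller_device }}"
  - name: "{{ port1_profile }}"
    type: ethernet
    interface_name: "{{ dhcp_interface1 }}"
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    type: ethernet
    interface_name: "{{ dhcp_interface2 }}"
    controller: "{{ controller_profile }}"
```

Each profile name is consumed twice per pass in the log (once for the `name` key, once where it is referenced as a `controller`), which matches the repeated "variable 'controller_profile' from source: play vars" lines.
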
11762 1726853264.33727: variable 'controller_profile' from source: play vars 11762 1726853264.33738: variable 'port2_profile' from source: play vars 11762 1726853264.33802: variable 'port2_profile' from source: play vars 11762 1726853264.33820: variable 'dhcp_interface2' from source: play vars 11762 1726853264.33883: variable 'dhcp_interface2' from source: play vars 11762 1726853264.33911: variable 'controller_profile' from source: play vars 11762 1726853264.33965: variable 'controller_profile' from source: play vars 11762 1726853264.34054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853264.34254: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853264.34288: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853264.34346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853264.34360: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853264.34410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853264.34435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853264.34476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.34564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11762 1726853264.34586: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853264.34852: variable 'network_connections' from source: include params 11762 1726853264.34862: variable 'controller_profile' from source: play vars 11762 1726853264.34935: variable 'controller_profile' from source: play vars 11762 1726853264.34947: variable 'controller_device' from source: play vars 11762 1726853264.35102: variable 'controller_device' from source: play vars 11762 1726853264.35106: variable 'port1_profile' from source: play vars 11762 1726853264.35108: variable 'port1_profile' from source: play vars 11762 1726853264.35111: variable 'dhcp_interface1' from source: play vars 11762 1726853264.35170: variable 'dhcp_interface1' from source: play vars 11762 1726853264.35185: variable 'controller_profile' from source: play vars 11762 1726853264.35259: variable 'controller_profile' from source: play vars 11762 1726853264.35272: variable 'port2_profile' from source: play vars 11762 1726853264.35347: variable 'port2_profile' from source: play vars 11762 1726853264.35359: variable 'dhcp_interface2' from source: play vars 11762 1726853264.35424: variable 'dhcp_interface2' from source: play vars 11762 1726853264.35438: variable 'controller_profile' from source: play vars 11762 1726853264.35504: variable 'controller_profile' from source: play vars 11762 1726853264.35545: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853264.35558: when evaluation is False, skipping this task 11762 1726853264.35565: _execute() done 11762 1726853264.35572: dumping result to json 11762 1726853264.35579: done dumping result, returning 11762 1726853264.35590: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-00000000027e] 11762 1726853264.35598: 
sending task result for task 02083763-bbaf-d845-03d0-00000000027e skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853264.35862: no more pending results, returning what we have 11762 1726853264.35867: results queue empty 11762 1726853264.35867: checking for any_errors_fatal 11762 1726853264.35874: done checking for any_errors_fatal 11762 1726853264.35875: checking for max_fail_percentage 11762 1726853264.35877: done checking for max_fail_percentage 11762 1726853264.35878: checking to see if all hosts have failed and the running result is not ok 11762 1726853264.35878: done checking to see if all hosts have failed 11762 1726853264.35879: getting the remaining hosts for this loop 11762 1726853264.35881: done getting the remaining hosts for this loop 11762 1726853264.35884: getting the next task for host managed_node2 11762 1726853264.35892: done getting next task for host managed_node2 11762 1726853264.35895: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11762 1726853264.35900: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853264.35916: getting variables 11762 1726853264.35917: in VariableManager get_vars() 11762 1726853264.35954: Calling all_inventory to load vars for managed_node2 11762 1726853264.35956: Calling groups_inventory to load vars for managed_node2 11762 1726853264.35959: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.35968: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.36086: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.36091: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.36783: done sending task result for task 02083763-bbaf-d845-03d0-00000000027e 11762 1726853264.36786: WORKER PROCESS EXITING 11762 1726853264.37536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.39220: done with get_vars() 11762 1726853264.39253: done getting variables 11762 1726853264.39317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:27:44 -0400 (0:00:00.149) 0:00:14.824 ****** 11762 1726853264.39359: entering _queue_task() for managed_node2/package 11762 1726853264.39876: worker is 1 (out 
of 1 available) 11762 1726853264.39888: exiting _queue_task() for managed_node2/package 11762 1726853264.39901: done queuing things up, now waiting for results queue to drain 11762 1726853264.39902: waiting for pending results... 11762 1726853264.39993: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11762 1726853264.40133: in run() - task 02083763-bbaf-d845-03d0-00000000027f 11762 1726853264.40153: variable 'ansible_search_path' from source: unknown 11762 1726853264.40161: variable 'ansible_search_path' from source: unknown 11762 1726853264.40201: calling self._execute() 11762 1726853264.40295: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.40306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.40321: variable 'omit' from source: magic vars 11762 1726853264.40778: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.40782: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.40927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853264.41203: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853264.41258: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853264.41296: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853264.41340: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853264.41454: variable 'network_packages' from source: role '' defaults 11762 1726853264.41565: variable '__network_provider_setup' from source: role '' defaults 11762 1726853264.41582: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853264.41651: 
variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853264.41665: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853264.41759: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853264.41922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853264.50178: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853264.50254: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853264.50349: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853264.50352: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853264.50391: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853264.50476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.50511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.50549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.50781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.50784: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.50787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.50790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.50792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.50794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.50796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.51068: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853264.51192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.51229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.51260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.51303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.51329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.51418: variable 'ansible_python' from source: facts 11762 1726853264.51447: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853264.51532: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853264.51622: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853264.51761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.51793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.51823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.51873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.51894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.51940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.51984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.52015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.52056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.52079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.52237: variable 'network_connections' from source: include params 11762 1726853264.52248: variable 'controller_profile' from source: play vars 11762 1726853264.52356: variable 'controller_profile' from source: play vars 11762 1726853264.52374: variable 'controller_device' from source: play vars 11762 1726853264.52481: variable 'controller_device' from source: play vars 11762 1726853264.52502: variable 'port1_profile' from source: play vars 11762 1726853264.52633: variable 'port1_profile' from source: play vars 11762 1726853264.52636: variable 'dhcp_interface1' from source: play vars 11762 1726853264.52721: variable 'dhcp_interface1' from source: play vars 11762 1726853264.52737: variable 'controller_profile' from source: play vars 
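
The "Ask user's consent to restart NetworkManager" task skipped earlier was gated on `__network_wireless_connections_defined or __network_team_connections_defined`, which evaluated False because no wireless or team connections appear in `network_connections`. A hedged sketch of the shape of such a guarded task — the `when` expression is the one logged; the module choice and prompt wording are assumptions:

```yaml
# Sketch only: the `when` expression is quoted from the log;
# the pause module and prompt text are assumptions.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.pause:
    prompt: >-
      Restarting NetworkManager will disrupt wireless or team
      connections. Press Enter to continue, Ctrl+C to abort.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

When the conditional is False, the executor emits exactly the skip payload seen above: `"false_condition"` holds the expression and `"skip_reason"` is `"Conditional result was False"`.
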
11762 1726853264.52849: variable 'controller_profile' from source: play vars 11762 1726853264.52852: variable 'port2_profile' from source: play vars 11762 1726853264.52945: variable 'port2_profile' from source: play vars 11762 1726853264.52967: variable 'dhcp_interface2' from source: play vars 11762 1726853264.53069: variable 'dhcp_interface2' from source: play vars 11762 1726853264.53087: variable 'controller_profile' from source: play vars 11762 1726853264.53189: variable 'controller_profile' from source: play vars 11762 1726853264.53252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853264.53287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853264.53397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.53400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853264.53403: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853264.53696: variable 'network_connections' from source: include params 11762 1726853264.53707: variable 'controller_profile' from source: play vars 11762 1726853264.53813: variable 'controller_profile' from source: play vars 11762 1726853264.53841: variable 'controller_device' from source: play vars 11762 1726853264.53950: variable 'controller_device' from source: play vars 11762 1726853264.53964: variable 'port1_profile' from source: play vars 11762 1726853264.54155: variable 'port1_profile' 
from source: play vars 11762 1726853264.54158: variable 'dhcp_interface1' from source: play vars 11762 1726853264.54194: variable 'dhcp_interface1' from source: play vars 11762 1726853264.54208: variable 'controller_profile' from source: play vars 11762 1726853264.54317: variable 'controller_profile' from source: play vars 11762 1726853264.54331: variable 'port2_profile' from source: play vars 11762 1726853264.54439: variable 'port2_profile' from source: play vars 11762 1726853264.54454: variable 'dhcp_interface2' from source: play vars 11762 1726853264.54560: variable 'dhcp_interface2' from source: play vars 11762 1726853264.54583: variable 'controller_profile' from source: play vars 11762 1726853264.54684: variable 'controller_profile' from source: play vars 11762 1726853264.54750: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853264.54840: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853264.55175: variable 'network_connections' from source: include params 11762 1726853264.55186: variable 'controller_profile' from source: play vars 11762 1726853264.55347: variable 'controller_profile' from source: play vars 11762 1726853264.55350: variable 'controller_device' from source: play vars 11762 1726853264.55353: variable 'controller_device' from source: play vars 11762 1726853264.55357: variable 'port1_profile' from source: play vars 11762 1726853264.55418: variable 'port1_profile' from source: play vars 11762 1726853264.55431: variable 'dhcp_interface1' from source: play vars 11762 1726853264.55506: variable 'dhcp_interface1' from source: play vars 11762 1726853264.55518: variable 'controller_profile' from source: play vars 11762 1726853264.55676: variable 'controller_profile' from source: play vars 11762 1726853264.55681: variable 'port2_profile' from source: play vars 11762 1726853264.55683: variable 'port2_profile' from source: play vars 11762 1726853264.55685: variable 
'dhcp_interface2' from source: play vars 11762 1726853264.55743: variable 'dhcp_interface2' from source: play vars 11762 1726853264.55754: variable 'controller_profile' from source: play vars 11762 1726853264.55876: variable 'controller_profile' from source: play vars 11762 1726853264.55879: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853264.55943: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853264.56291: variable 'network_connections' from source: include params 11762 1726853264.56301: variable 'controller_profile' from source: play vars 11762 1726853264.56378: variable 'controller_profile' from source: play vars 11762 1726853264.56390: variable 'controller_device' from source: play vars 11762 1726853264.56463: variable 'controller_device' from source: play vars 11762 1726853264.56484: variable 'port1_profile' from source: play vars 11762 1726853264.56643: variable 'port1_profile' from source: play vars 11762 1726853264.56647: variable 'dhcp_interface1' from source: play vars 11762 1726853264.56650: variable 'dhcp_interface1' from source: play vars 11762 1726853264.56652: variable 'controller_profile' from source: play vars 11762 1726853264.56712: variable 'controller_profile' from source: play vars 11762 1726853264.56769: variable 'port2_profile' from source: play vars 11762 1726853264.56796: variable 'port2_profile' from source: play vars 11762 1726853264.56807: variable 'dhcp_interface2' from source: play vars 11762 1726853264.56876: variable 'dhcp_interface2' from source: play vars 11762 1726853264.56890: variable 'controller_profile' from source: play vars 11762 1726853264.56953: variable 'controller_profile' from source: play vars 11762 1726853264.57027: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853264.57275: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853264.57279: variable 
'__network_packages_default_initscripts' from source: role '' defaults 11762 1726853264.57281: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853264.57397: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853264.57942: variable 'network_connections' from source: include params 11762 1726853264.57945: variable 'controller_profile' from source: play vars 11762 1726853264.57978: variable 'controller_profile' from source: play vars 11762 1726853264.57995: variable 'controller_device' from source: play vars 11762 1726853264.58062: variable 'controller_device' from source: play vars 11762 1726853264.58083: variable 'port1_profile' from source: play vars 11762 1726853264.58145: variable 'port1_profile' from source: play vars 11762 1726853264.58164: variable 'dhcp_interface1' from source: play vars 11762 1726853264.58269: variable 'dhcp_interface1' from source: play vars 11762 1726853264.58274: variable 'controller_profile' from source: play vars 11762 1726853264.58305: variable 'controller_profile' from source: play vars 11762 1726853264.58316: variable 'port2_profile' from source: play vars 11762 1726853264.58385: variable 'port2_profile' from source: play vars 11762 1726853264.58396: variable 'dhcp_interface2' from source: play vars 11762 1726853264.58456: variable 'dhcp_interface2' from source: play vars 11762 1726853264.58467: variable 'controller_profile' from source: play vars 11762 1726853264.58595: variable 'controller_profile' from source: play vars 11762 1726853264.58598: variable 'ansible_distribution' from source: facts 11762 1726853264.58600: variable '__network_rh_distros' from source: role '' defaults 11762 1726853264.58602: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.58605: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853264.58764: variable 'ansible_distribution' from 
source: facts 11762 1726853264.58775: variable '__network_rh_distros' from source: role '' defaults 11762 1726853264.58785: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.58800: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853264.58974: variable 'ansible_distribution' from source: facts 11762 1726853264.58984: variable '__network_rh_distros' from source: role '' defaults 11762 1726853264.58993: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.59039: variable 'network_provider' from source: set_fact 11762 1726853264.59059: variable 'ansible_facts' from source: unknown 11762 1726853264.59787: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11762 1726853264.59790: when evaluation is False, skipping this task 11762 1726853264.59793: _execute() done 11762 1726853264.59795: dumping result to json 11762 1726853264.59797: done dumping result, returning 11762 1726853264.59799: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-d845-03d0-00000000027f] 11762 1726853264.59801: sending task result for task 02083763-bbaf-d845-03d0-00000000027f 11762 1726853264.59875: done sending task result for task 02083763-bbaf-d845-03d0-00000000027f 11762 1726853264.59878: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11762 1726853264.59934: no more pending results, returning what we have 11762 1726853264.59938: results queue empty 11762 1726853264.59939: checking for any_errors_fatal 11762 1726853264.59943: done checking for any_errors_fatal 11762 1726853264.59944: checking for max_fail_percentage 11762 1726853264.59946: done checking for max_fail_percentage 11762 1726853264.59947: checking to see if 
all hosts have failed and the running result is not ok 11762 1726853264.59948: done checking to see if all hosts have failed 11762 1726853264.59948: getting the remaining hosts for this loop 11762 1726853264.59950: done getting the remaining hosts for this loop 11762 1726853264.59954: getting the next task for host managed_node2 11762 1726853264.59961: done getting next task for host managed_node2 11762 1726853264.59965: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853264.59972: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853264.59988: getting variables 11762 1726853264.59989: in VariableManager get_vars() 11762 1726853264.60102: Calling all_inventory to load vars for managed_node2 11762 1726853264.60109: Calling groups_inventory to load vars for managed_node2 11762 1726853264.60112: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.60122: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.60125: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.60128: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.64969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.66313: done with get_vars() 11762 1726853264.66334: done getting variables 11762 1726853264.66376: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:27:44 -0400 (0:00:00.270) 0:00:15.094 ****** 11762 1726853264.66398: entering _queue_task() for managed_node2/package 11762 1726853264.66644: worker is 1 (out of 1 available) 11762 1726853264.66656: exiting _queue_task() for managed_node2/package 11762 1726853264.66670: done queuing things up, now waiting for results queue to drain 11762 1726853264.66673: waiting for pending results... 
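
The "Install packages" task above was skipped because `not network_packages is subset(ansible_facts.packages.keys())` evaluated False — every package in `network_packages` was already present in the gathered package facts — and the nmstate task is gated on `network_state != {}`. A hedged sketch of those two guards; the conditionals are quoted from the log, and the `package` action matches the ActionModule loaded above, but the exact argument names are assumptions:

```yaml
# Sketch: `when` expressions are quoted from the log; argument details assumed.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

The `subset` test makes the install idempotent at the fact level: the task only runs when at least one required package is missing, avoiding a package-manager round trip on every play.
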
11762 1726853264.66853: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853264.66948: in run() - task 02083763-bbaf-d845-03d0-000000000280 11762 1726853264.66963: variable 'ansible_search_path' from source: unknown 11762 1726853264.66968: variable 'ansible_search_path' from source: unknown 11762 1726853264.66997: calling self._execute() 11762 1726853264.67073: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.67077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.67087: variable 'omit' from source: magic vars 11762 1726853264.67397: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.67407: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.67495: variable 'network_state' from source: role '' defaults 11762 1726853264.67502: Evaluated conditional (network_state != {}): False 11762 1726853264.67505: when evaluation is False, skipping this task 11762 1726853264.67507: _execute() done 11762 1726853264.67510: dumping result to json 11762 1726853264.67512: done dumping result, returning 11762 1726853264.67520: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-d845-03d0-000000000280] 11762 1726853264.67526: sending task result for task 02083763-bbaf-d845-03d0-000000000280 11762 1726853264.67620: done sending task result for task 02083763-bbaf-d845-03d0-000000000280 11762 1726853264.67622: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853264.67701: no more pending results, returning what we have 11762 1726853264.67706: results queue empty 11762 1726853264.67706: checking 
for any_errors_fatal 11762 1726853264.67717: done checking for any_errors_fatal 11762 1726853264.67718: checking for max_fail_percentage 11762 1726853264.67720: done checking for max_fail_percentage 11762 1726853264.67721: checking to see if all hosts have failed and the running result is not ok 11762 1726853264.67721: done checking to see if all hosts have failed 11762 1726853264.67722: getting the remaining hosts for this loop 11762 1726853264.67724: done getting the remaining hosts for this loop 11762 1726853264.67727: getting the next task for host managed_node2 11762 1726853264.67736: done getting next task for host managed_node2 11762 1726853264.67739: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853264.67746: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853264.67760: getting variables 11762 1726853264.67761: in VariableManager get_vars() 11762 1726853264.67792: Calling all_inventory to load vars for managed_node2 11762 1726853264.67795: Calling groups_inventory to load vars for managed_node2 11762 1726853264.67797: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.67805: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.67807: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.67809: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.68856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.70804: done with get_vars() 11762 1726853264.70828: done getting variables 11762 1726853264.70875: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:27:44 -0400 (0:00:00.044) 0:00:15.139 ****** 11762 1726853264.70901: entering _queue_task() for managed_node2/package 11762 1726853264.71146: worker is 1 (out of 1 available) 11762 1726853264.71161: exiting _queue_task() for managed_node2/package 11762 1726853264.71175: done queuing things up, now waiting for results queue to drain 11762 1726853264.71177: waiting for pending results... 
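[Editor's note] This second task (main.yml:96) is guarded by the same `network_state != {}` condition and skips for the same reason. The log attributes `network_state` to `role '' defaults`, which suggests the role's defaults file defines it as an empty mapping, roughly like the sketch below — an assumption about the defaults file, not its actual contents:

```yaml
# defaults/main.yml (assumed shape): with this default, every
# `when: network_state != {}` guard in the role evaluates False
# unless the caller sets network_state explicitly.
network_state: {}
```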
11762 1726853264.71352: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853264.71485: in run() - task 02083763-bbaf-d845-03d0-000000000281 11762 1726853264.71525: variable 'ansible_search_path' from source: unknown 11762 1726853264.71529: variable 'ansible_search_path' from source: unknown 11762 1726853264.71537: calling self._execute() 11762 1726853264.71636: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.71654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.71657: variable 'omit' from source: magic vars 11762 1726853264.72027: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.72175: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.72178: variable 'network_state' from source: role '' defaults 11762 1726853264.72183: Evaluated conditional (network_state != {}): False 11762 1726853264.72190: when evaluation is False, skipping this task 11762 1726853264.72197: _execute() done 11762 1726853264.72202: dumping result to json 11762 1726853264.72208: done dumping result, returning 11762 1726853264.72220: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-d845-03d0-000000000281] 11762 1726853264.72231: sending task result for task 02083763-bbaf-d845-03d0-000000000281 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853264.72392: no more pending results, returning what we have 11762 1726853264.72397: results queue empty 11762 1726853264.72398: checking for any_errors_fatal 11762 1726853264.72406: done checking for any_errors_fatal 11762 1726853264.72407: checking for max_fail_percentage 11762 
1726853264.72409: done checking for max_fail_percentage 11762 1726853264.72410: checking to see if all hosts have failed and the running result is not ok 11762 1726853264.72410: done checking to see if all hosts have failed 11762 1726853264.72411: getting the remaining hosts for this loop 11762 1726853264.72413: done getting the remaining hosts for this loop 11762 1726853264.72416: getting the next task for host managed_node2 11762 1726853264.72424: done getting next task for host managed_node2 11762 1726853264.72428: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853264.72433: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853264.72449: getting variables 11762 1726853264.72450: in VariableManager get_vars() 11762 1726853264.72494: Calling all_inventory to load vars for managed_node2 11762 1726853264.72498: Calling groups_inventory to load vars for managed_node2 11762 1726853264.72501: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.72514: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.72518: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.72521: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.73183: done sending task result for task 02083763-bbaf-d845-03d0-000000000281 11762 1726853264.73187: WORKER PROCESS EXITING 11762 1726853264.75077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.76662: done with get_vars() 11762 1726853264.76688: done getting variables 11762 1726853264.76846: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:27:44 -0400 (0:00:00.059) 0:00:15.199 ****** 11762 1726853264.76883: entering _queue_task() for managed_node2/service 11762 1726853264.76885: Creating lock for service 11762 1726853264.77611: worker is 1 (out of 1 available) 11762 1726853264.77623: exiting _queue_task() for managed_node2/service 11762 1726853264.77635: done queuing things up, now waiting for results queue to drain 11762 1726853264.77637: waiting for pending results... 
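[Editor's note] The restart task announced above (main.yml:109) uses a compound guard. The log below resolves `network_connections` to a bond controller with two ethernet ports (`controller_profile`, `port1_profile`/`dhcp_interface1`, `port2_profile`/`dhcp_interface2`), so neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is True and the task skips. A plausible shape for the task — illustrative only, not the role's verbatim source:

```yaml
# Illustrative reconstruction: restart only when wireless or team
# connection types appear in network_connections.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```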
11762 1726853264.77891: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853264.78151: in run() - task 02083763-bbaf-d845-03d0-000000000282 11762 1726853264.78176: variable 'ansible_search_path' from source: unknown 11762 1726853264.78190: variable 'ansible_search_path' from source: unknown 11762 1726853264.78232: calling self._execute() 11762 1726853264.78387: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.78404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.78419: variable 'omit' from source: magic vars 11762 1726853264.78824: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.78848: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.78977: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853264.79184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853264.81374: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853264.81461: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853264.81503: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853264.81543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853264.81579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853264.81665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11762 1726853264.81700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.81729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.81875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.81879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.81881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.81884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.81899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.81941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.81960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.82009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.82035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.82064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.82113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.82131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.82313: variable 'network_connections' from source: include params 11762 1726853264.82335: variable 'controller_profile' from source: play vars 11762 1726853264.82408: variable 'controller_profile' from source: play vars 11762 1726853264.82430: variable 'controller_device' from source: play vars 11762 1726853264.82541: variable 'controller_device' from source: play vars 11762 1726853264.82544: variable 'port1_profile' from source: play vars 11762 1726853264.82575: variable 'port1_profile' from source: play vars 11762 1726853264.82588: variable 'dhcp_interface1' from source: play vars 11762 1726853264.82655: variable 'dhcp_interface1' from source: play vars 11762 1726853264.82667: variable 'controller_profile' from source: play vars 
11762 1726853264.82734: variable 'controller_profile' from source: play vars 11762 1726853264.82746: variable 'port2_profile' from source: play vars 11762 1726853264.82812: variable 'port2_profile' from source: play vars 11762 1726853264.82825: variable 'dhcp_interface2' from source: play vars 11762 1726853264.82975: variable 'dhcp_interface2' from source: play vars 11762 1726853264.82978: variable 'controller_profile' from source: play vars 11762 1726853264.82980: variable 'controller_profile' from source: play vars 11762 1726853264.83033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853264.83229: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853264.83270: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853264.83310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853264.83341: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853264.83389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853264.83418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853264.83449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.83482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11762 1726853264.83561: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853264.83808: variable 'network_connections' from source: include params 11762 1726853264.83819: variable 'controller_profile' from source: play vars 11762 1726853264.83976: variable 'controller_profile' from source: play vars 11762 1726853264.83979: variable 'controller_device' from source: play vars 11762 1726853264.83981: variable 'controller_device' from source: play vars 11762 1726853264.83983: variable 'port1_profile' from source: play vars 11762 1726853264.84034: variable 'port1_profile' from source: play vars 11762 1726853264.84046: variable 'dhcp_interface1' from source: play vars 11762 1726853264.84111: variable 'dhcp_interface1' from source: play vars 11762 1726853264.84123: variable 'controller_profile' from source: play vars 11762 1726853264.84177: variable 'controller_profile' from source: play vars 11762 1726853264.84187: variable 'port2_profile' from source: play vars 11762 1726853264.84245: variable 'port2_profile' from source: play vars 11762 1726853264.84256: variable 'dhcp_interface2' from source: play vars 11762 1726853264.84314: variable 'dhcp_interface2' from source: play vars 11762 1726853264.84326: variable 'controller_profile' from source: play vars 11762 1726853264.84423: variable 'controller_profile' from source: play vars 11762 1726853264.84435: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853264.84443: when evaluation is False, skipping this task 11762 1726853264.84450: _execute() done 11762 1726853264.84456: dumping result to json 11762 1726853264.84462: done dumping result, returning 11762 1726853264.84477: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000282] 11762 1726853264.84487: sending task result for 
task 02083763-bbaf-d845-03d0-000000000282 11762 1726853264.84876: done sending task result for task 02083763-bbaf-d845-03d0-000000000282 11762 1726853264.84879: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853264.84921: no more pending results, returning what we have 11762 1726853264.84925: results queue empty 11762 1726853264.84926: checking for any_errors_fatal 11762 1726853264.84932: done checking for any_errors_fatal 11762 1726853264.84933: checking for max_fail_percentage 11762 1726853264.84935: done checking for max_fail_percentage 11762 1726853264.84936: checking to see if all hosts have failed and the running result is not ok 11762 1726853264.84936: done checking to see if all hosts have failed 11762 1726853264.84937: getting the remaining hosts for this loop 11762 1726853264.84939: done getting the remaining hosts for this loop 11762 1726853264.84942: getting the next task for host managed_node2 11762 1726853264.84949: done getting next task for host managed_node2 11762 1726853264.84952: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853264.84957: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853264.84973: getting variables 11762 1726853264.84974: in VariableManager get_vars() 11762 1726853264.85009: Calling all_inventory to load vars for managed_node2 11762 1726853264.85012: Calling groups_inventory to load vars for managed_node2 11762 1726853264.85014: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853264.85024: Calling all_plugins_play to load vars for managed_node2 11762 1726853264.85027: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853264.85031: Calling groups_plugins_play to load vars for managed_node2 11762 1726853264.86369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853264.88057: done with get_vars() 11762 1726853264.88081: done getting variables 11762 1726853264.88141: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:27:44 -0400 (0:00:00.112) 0:00:15.312 
****** 11762 1726853264.88174: entering _queue_task() for managed_node2/service 11762 1726853264.88486: worker is 1 (out of 1 available) 11762 1726853264.88499: exiting _queue_task() for managed_node2/service 11762 1726853264.88511: done queuing things up, now waiting for results queue to drain 11762 1726853264.88512: waiting for pending results... 11762 1726853264.88790: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853264.88943: in run() - task 02083763-bbaf-d845-03d0-000000000283 11762 1726853264.88962: variable 'ansible_search_path' from source: unknown 11762 1726853264.88970: variable 'ansible_search_path' from source: unknown 11762 1726853264.89013: calling self._execute() 11762 1726853264.89108: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853264.89121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853264.89137: variable 'omit' from source: magic vars 11762 1726853264.89508: variable 'ansible_distribution_major_version' from source: facts 11762 1726853264.89562: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853264.89686: variable 'network_provider' from source: set_fact 11762 1726853264.89697: variable 'network_state' from source: role '' defaults 11762 1726853264.89710: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11762 1726853264.89722: variable 'omit' from source: magic vars 11762 1726853264.89786: variable 'omit' from source: magic vars 11762 1726853264.89814: variable 'network_service_name' from source: role '' defaults 11762 1726853264.89885: variable 'network_service_name' from source: role '' defaults 11762 1726853264.89969: variable '__network_provider_setup' from source: role '' defaults 11762 1726853264.89993: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853264.90103: variable 
'__network_service_name_default_nm' from source: role '' defaults 11762 1726853264.90118: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853264.90189: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853264.90469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853264.93030: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853264.93113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853264.93177: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853264.93202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853264.93230: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853264.93354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.93360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.93394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.93437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.93462: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.93513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.93570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.93579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.93621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.93638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.93885: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853264.94015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.94075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.94078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.94120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.94140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.94246: variable 'ansible_python' from source: facts 11762 1726853264.94267: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853264.94377: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853264.94451: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853264.94678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.94682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.94684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.94686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.94692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.94741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853264.94781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853264.94809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.94843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853264.94861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853264.95115: variable 'network_connections' from source: include params 11762 1726853264.95119: variable 'controller_profile' from source: play vars 11762 1726853264.95130: variable 'controller_profile' from source: play vars 11762 1726853264.95150: variable 'controller_device' from source: play vars 11762 1726853264.95235: variable 'controller_device' from source: play vars 11762 1726853264.95262: variable 'port1_profile' from source: play vars 11762 1726853264.95347: variable 'port1_profile' from source: play vars 11762 1726853264.95364: variable 'dhcp_interface1' from source: play vars 11762 1726853264.95481: variable 'dhcp_interface1' from source: play vars 11762 1726853264.95506: variable 'controller_profile' from source: play vars 
11762 1726853264.95590: variable 'controller_profile' from source: play vars 11762 1726853264.95623: variable 'port2_profile' from source: play vars 11762 1726853264.95708: variable 'port2_profile' from source: play vars 11762 1726853264.95726: variable 'dhcp_interface2' from source: play vars 11762 1726853264.95810: variable 'dhcp_interface2' from source: play vars 11762 1726853264.95828: variable 'controller_profile' from source: play vars 11762 1726853264.95910: variable 'controller_profile' from source: play vars 11762 1726853264.96098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853264.96401: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853264.96466: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853264.96510: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853264.96556: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853264.96622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853264.96749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853264.96752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853264.96754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11762 1726853264.96790: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853264.97089: variable 'network_connections' from source: include params 11762 1726853264.97101: variable 'controller_profile' from source: play vars 11762 1726853264.97182: variable 'controller_profile' from source: play vars 11762 1726853264.97197: variable 'controller_device' from source: play vars 11762 1726853264.97274: variable 'controller_device' from source: play vars 11762 1726853264.97299: variable 'port1_profile' from source: play vars 11762 1726853264.97377: variable 'port1_profile' from source: play vars 11762 1726853264.97400: variable 'dhcp_interface1' from source: play vars 11762 1726853264.97483: variable 'dhcp_interface1' from source: play vars 11762 1726853264.97507: variable 'controller_profile' from source: play vars 11762 1726853264.97587: variable 'controller_profile' from source: play vars 11762 1726853264.97602: variable 'port2_profile' from source: play vars 11762 1726853264.97683: variable 'port2_profile' from source: play vars 11762 1726853264.97723: variable 'dhcp_interface2' from source: play vars 11762 1726853264.97782: variable 'dhcp_interface2' from source: play vars 11762 1726853264.97796: variable 'controller_profile' from source: play vars 11762 1726853264.97879: variable 'controller_profile' from source: play vars 11762 1726853264.98048: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853264.98052: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853264.98331: variable 'network_connections' from source: include params 11762 1726853264.98341: variable 'controller_profile' from source: play vars 11762 1726853264.98420: variable 'controller_profile' from source: play vars 11762 1726853264.98433: variable 'controller_device' from source: play vars 11762 1726853264.98512: variable 'controller_device' from source: play vars 
11762 1726853264.98528: variable 'port1_profile' from source: play vars 11762 1726853264.98608: variable 'port1_profile' from source: play vars 11762 1726853264.98619: variable 'dhcp_interface1' from source: play vars 11762 1726853264.98694: variable 'dhcp_interface1' from source: play vars 11762 1726853264.98705: variable 'controller_profile' from source: play vars 11762 1726853264.98783: variable 'controller_profile' from source: play vars 11762 1726853264.98795: variable 'port2_profile' from source: play vars 11762 1726853264.98874: variable 'port2_profile' from source: play vars 11762 1726853264.98886: variable 'dhcp_interface2' from source: play vars 11762 1726853264.98962: variable 'dhcp_interface2' from source: play vars 11762 1726853264.99039: variable 'controller_profile' from source: play vars 11762 1726853264.99054: variable 'controller_profile' from source: play vars 11762 1726853264.99085: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853264.99175: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853264.99505: variable 'network_connections' from source: include params 11762 1726853264.99516: variable 'controller_profile' from source: play vars 11762 1726853264.99597: variable 'controller_profile' from source: play vars 11762 1726853264.99608: variable 'controller_device' from source: play vars 11762 1726853264.99682: variable 'controller_device' from source: play vars 11762 1726853264.99704: variable 'port1_profile' from source: play vars 11762 1726853264.99778: variable 'port1_profile' from source: play vars 11762 1726853264.99790: variable 'dhcp_interface1' from source: play vars 11762 1726853264.99912: variable 'dhcp_interface1' from source: play vars 11762 1726853264.99915: variable 'controller_profile' from source: play vars 11762 1726853264.99951: variable 'controller_profile' from source: play vars 11762 1726853264.99963: variable 'port2_profile' from source: play vars 
11762 1726853265.00039: variable 'port2_profile' from source: play vars 11762 1726853265.00057: variable 'dhcp_interface2' from source: play vars 11762 1726853265.00122: variable 'dhcp_interface2' from source: play vars 11762 1726853265.00137: variable 'controller_profile' from source: play vars 11762 1726853265.00238: variable 'controller_profile' from source: play vars 11762 1726853265.00274: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853265.00352: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853265.00365: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853265.00431: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853265.00779: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853265.01282: variable 'network_connections' from source: include params 11762 1726853265.01292: variable 'controller_profile' from source: play vars 11762 1726853265.01377: variable 'controller_profile' from source: play vars 11762 1726853265.01390: variable 'controller_device' from source: play vars 11762 1726853265.01456: variable 'controller_device' from source: play vars 11762 1726853265.01502: variable 'port1_profile' from source: play vars 11762 1726853265.01643: variable 'port1_profile' from source: play vars 11762 1726853265.01759: variable 'dhcp_interface1' from source: play vars 11762 1726853265.01762: variable 'dhcp_interface1' from source: play vars 11762 1726853265.01764: variable 'controller_profile' from source: play vars 11762 1726853265.01785: variable 'controller_profile' from source: play vars 11762 1726853265.01797: variable 'port2_profile' from source: play vars 11762 1726853265.01861: variable 'port2_profile' from source: play vars 11762 1726853265.01878: variable 'dhcp_interface2' from source: play vars 11762 1726853265.01940: variable 
'dhcp_interface2' from source: play vars 11762 1726853265.01954: variable 'controller_profile' from source: play vars 11762 1726853265.02015: variable 'controller_profile' from source: play vars 11762 1726853265.02027: variable 'ansible_distribution' from source: facts 11762 1726853265.02033: variable '__network_rh_distros' from source: role '' defaults 11762 1726853265.02041: variable 'ansible_distribution_major_version' from source: facts 11762 1726853265.02074: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853265.02258: variable 'ansible_distribution' from source: facts 11762 1726853265.02267: variable '__network_rh_distros' from source: role '' defaults 11762 1726853265.02278: variable 'ansible_distribution_major_version' from source: facts 11762 1726853265.02310: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853265.02479: variable 'ansible_distribution' from source: facts 11762 1726853265.02527: variable '__network_rh_distros' from source: role '' defaults 11762 1726853265.02530: variable 'ansible_distribution_major_version' from source: facts 11762 1726853265.02538: variable 'network_provider' from source: set_fact 11762 1726853265.02567: variable 'omit' from source: magic vars 11762 1726853265.02602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853265.02639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853265.02667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853265.02691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853265.02743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11762 1726853265.02748: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853265.02751: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853265.02753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853265.02860: Set connection var ansible_timeout to 10 11762 1726853265.02868: Set connection var ansible_shell_type to sh 11762 1726853265.02880: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853265.02889: Set connection var ansible_shell_executable to /bin/sh 11762 1726853265.02899: Set connection var ansible_pipelining to False 11762 1726853265.02909: Set connection var ansible_connection to ssh 11762 1726853265.02960: variable 'ansible_shell_executable' from source: unknown 11762 1726853265.02963: variable 'ansible_connection' from source: unknown 11762 1726853265.02966: variable 'ansible_module_compression' from source: unknown 11762 1726853265.02968: variable 'ansible_shell_type' from source: unknown 11762 1726853265.02969: variable 'ansible_shell_executable' from source: unknown 11762 1726853265.02973: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853265.02975: variable 'ansible_pipelining' from source: unknown 11762 1726853265.02977: variable 'ansible_timeout' from source: unknown 11762 1726853265.03069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853265.03101: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853265.03116: variable 'omit' from source: magic vars 11762 1726853265.03126: starting attempt loop 11762 1726853265.03132: running the handler 11762 1726853265.03218: variable 
'ansible_facts' from source: unknown 11762 1726853265.04005: _low_level_execute_command(): starting 11762 1726853265.04018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853265.04798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.04856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853265.04908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853265.04979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853265.06732: stdout chunk (state=3): >>>/root <<< 11762 1726853265.06887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853265.06958: stderr chunk (state=3): >>><<< 11762 1726853265.07038: stdout chunk (state=3): >>><<< 11762 1726853265.07290: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853265.07303: _low_level_execute_command(): starting 11762 1726853265.07388: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906 `" && echo ansible-tmp-1726853265.072903-12426-165555220685906="` echo /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906 `" ) && sleep 0' 11762 1726853265.07930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853265.07937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853265.07950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853265.07965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853265.07979: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853265.07986: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853265.07996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.08010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853265.08017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853265.08024: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853265.08034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853265.08041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853265.08056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853265.08064: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853265.08070: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853265.08088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.08156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853265.08168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853265.08193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853265.08293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853265.10355: stdout chunk (state=3): >>>ansible-tmp-1726853265.072903-12426-165555220685906=/root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906 <<< 11762 1726853265.10569: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 11762 1726853265.10601: stderr chunk (state=3): >>><<< 11762 1726853265.10604: stdout chunk (state=3): >>><<< 11762 1726853265.10776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853265.072903-12426-165555220685906=/root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853265.10780: variable 'ansible_module_compression' from source: unknown 11762 1726853265.10784: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11762 1726853265.10787: ANSIBALLZ: Acquiring lock 11762 1726853265.10790: ANSIBALLZ: Lock acquired: 139956166284816 11762 1726853265.10792: ANSIBALLZ: Creating module 11762 1726853265.45815: ANSIBALLZ: Writing module into payload 11762 1726853265.46223: ANSIBALLZ: Writing module 11762 1726853265.46256: ANSIBALLZ: Renaming module 11762 
1726853265.46275: ANSIBALLZ: Done creating module 11762 1726853265.46493: variable 'ansible_facts' from source: unknown 11762 1726853265.46825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/AnsiballZ_systemd.py 11762 1726853265.47279: Sending initial data 11762 1726853265.47282: Sent initial data (155 bytes) 11762 1726853265.48044: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853265.48057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853265.48127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.48177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853265.48193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853265.48215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853265.48322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853265.50049: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853265.50118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853265.50197: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp03006nws /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/AnsiballZ_systemd.py <<< 11762 1726853265.50286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/AnsiballZ_systemd.py" <<< 11762 1726853265.50315: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp03006nws" to remote "/root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/AnsiballZ_systemd.py" <<< 11762 1726853265.52149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853265.52207: stderr chunk (state=3): >>><<< 11762 1726853265.52219: stdout chunk (state=3): >>><<< 11762 1726853265.52284: done transferring module to remote 11762 1726853265.52338: _low_level_execute_command(): starting 11762 1726853265.52348: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/ /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/AnsiballZ_systemd.py && sleep 0' 11762 1726853265.53483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853265.53487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853265.53489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853265.53491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853265.53493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853265.53495: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853265.53497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.53499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853265.53500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853265.53502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853265.53504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853265.53506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853265.53508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853265.53509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853265.53511: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853265.53513: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.53591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853265.53594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853265.53645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853265.53749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853265.55696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853265.55701: stdout chunk (state=3): >>><<< 11762 1726853265.55707: stderr chunk (state=3): >>><<< 11762 1726853265.55730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853265.55733: _low_level_execute_command(): 
starting 11762 1726853265.55738: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/AnsiballZ_systemd.py && sleep 0' 11762 1726853265.56512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.56607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853265.56694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853265.56718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853265.57114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853265.87967: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", 
"TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", 
"ControlGroupId": "4605", "MemoryCurrent": "3866624", "MemoryPeak": "4399104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319619584", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "382963000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 11762 1726853265.87985: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", 
"LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", 
"IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syste<<< 11762 1726853265.88102: stdout chunk (state=3): >>>m.slice 
dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": 
"started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11762 1726853265.90179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853265.90183: stdout chunk (state=3): >>><<< 11762 1726853265.90185: stderr chunk (state=3): >>><<< 11762 1726853265.90190: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "3866624", "MemoryPeak": "4399104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319619584", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "382963000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", 
"MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853265.90264: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853265.90298: _low_level_execute_command(): starting 11762 1726853265.90302: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853265.072903-12426-165555220685906/ > /dev/null 2>&1 && sleep 0' 11762 1726853265.90973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853265.90983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853265.90994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853265.91009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853265.91022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853265.91030: stderr chunk (state=3): >>>debug2: match not found 
<<< 11762 1726853265.91048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.91060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853265.91069: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853265.91079: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853265.91176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853265.91179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853265.91190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853265.91210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853265.91387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853265.93293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853265.93343: stderr chunk (state=3): >>><<< 11762 1726853265.93347: stdout chunk (state=3): >>><<< 11762 1726853265.93362: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853265.93369: handler run complete 11762 1726853265.93410: attempt loop complete, returning result 11762 1726853265.93414: _execute() done 11762 1726853265.93416: dumping result to json 11762 1726853265.93432: done dumping result, returning 11762 1726853265.93443: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-d845-03d0-000000000283] 11762 1726853265.93450: sending task result for task 02083763-bbaf-d845-03d0-000000000283 11762 1726853265.93681: done sending task result for task 02083763-bbaf-d845-03d0-000000000283 11762 1726853265.93684: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853265.93730: no more pending results, returning what we have 11762 1726853265.93734: results queue empty 11762 1726853265.93735: checking for any_errors_fatal 11762 1726853265.93742: done checking for any_errors_fatal 11762 1726853265.93742: checking for max_fail_percentage 11762 1726853265.93744: done checking for max_fail_percentage 11762 1726853265.93745: checking to see if all hosts have failed and the 
running result is not ok 11762 1726853265.93746: done checking to see if all hosts have failed 11762 1726853265.93746: getting the remaining hosts for this loop 11762 1726853265.93748: done getting the remaining hosts for this loop 11762 1726853265.93752: getting the next task for host managed_node2 11762 1726853265.93758: done getting next task for host managed_node2 11762 1726853265.93761: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11762 1726853265.93766: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853265.93778: getting variables 11762 1726853265.93780: in VariableManager get_vars() 11762 1726853265.93811: Calling all_inventory to load vars for managed_node2 11762 1726853265.93814: Calling groups_inventory to load vars for managed_node2 11762 1726853265.93816: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853265.93826: Calling all_plugins_play to load vars for managed_node2 11762 1726853265.93828: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853265.93831: Calling groups_plugins_play to load vars for managed_node2 11762 1726853265.94740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853265.96087: done with get_vars() 11762 1726853265.96125: done getting variables 11762 1726853265.96238: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:27:45 -0400 (0:00:01.081) 0:00:16.393 ****** 11762 1726853265.96287: entering _queue_task() for managed_node2/service 11762 1726853265.96648: worker is 1 (out of 1 available) 11762 1726853265.96661: exiting _queue_task() for managed_node2/service 11762 1726853265.96676: done queuing things up, now waiting for results queue to drain 11762 1726853265.96678: waiting for pending results... 
11762 1726853265.97038: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11762 1726853265.97139: in run() - task 02083763-bbaf-d845-03d0-000000000284 11762 1726853265.97153: variable 'ansible_search_path' from source: unknown 11762 1726853265.97156: variable 'ansible_search_path' from source: unknown 11762 1726853265.97187: calling self._execute() 11762 1726853265.97257: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853265.97263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853265.97272: variable 'omit' from source: magic vars 11762 1726853265.97557: variable 'ansible_distribution_major_version' from source: facts 11762 1726853265.97568: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853265.97650: variable 'network_provider' from source: set_fact 11762 1726853265.97654: Evaluated conditional (network_provider == "nm"): True 11762 1726853265.97737: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853265.97976: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853265.97995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853266.00935: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853266.01005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853266.01046: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853266.01086: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853266.01116: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853266.01197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853266.01231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853266.01260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853266.01305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853266.01323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853266.01375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853266.01402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853266.01446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853266.01491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853266.01509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853266.01551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853266.01581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853266.01623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853266.01677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853266.01699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853266.01867: variable 'network_connections' from source: include params 11762 1726853266.01891: variable 'controller_profile' from source: play vars 11762 1726853266.01960: variable 'controller_profile' from source: play vars 11762 1726853266.01977: variable 'controller_device' from source: play vars 11762 1726853266.02040: variable 'controller_device' from source: play vars 11762 1726853266.02057: variable 'port1_profile' from source: play vars 11762 1726853266.02127: variable 'port1_profile' from source: play vars 
11762 1726853266.02139: variable 'dhcp_interface1' from source: play vars 11762 1726853266.02200: variable 'dhcp_interface1' from source: play vars 11762 1726853266.02211: variable 'controller_profile' from source: play vars 11762 1726853266.02274: variable 'controller_profile' from source: play vars 11762 1726853266.02287: variable 'port2_profile' from source: play vars 11762 1726853266.02377: variable 'port2_profile' from source: play vars 11762 1726853266.02380: variable 'dhcp_interface2' from source: play vars 11762 1726853266.02419: variable 'dhcp_interface2' from source: play vars 11762 1726853266.02430: variable 'controller_profile' from source: play vars 11762 1726853266.02507: variable 'controller_profile' from source: play vars 11762 1726853266.02584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853266.02876: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853266.02879: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853266.02881: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853266.02883: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853266.02915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853266.02947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853266.02982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 11762 1726853266.03022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853266.03102: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853266.03422: variable 'network_connections' from source: include params 11762 1726853266.03433: variable 'controller_profile' from source: play vars 11762 1726853266.03500: variable 'controller_profile' from source: play vars 11762 1726853266.03512: variable 'controller_device' from source: play vars 11762 1726853266.03576: variable 'controller_device' from source: play vars 11762 1726853266.03594: variable 'port1_profile' from source: play vars 11762 1726853266.03654: variable 'port1_profile' from source: play vars 11762 1726853266.03876: variable 'dhcp_interface1' from source: play vars 11762 1726853266.03879: variable 'dhcp_interface1' from source: play vars 11762 1726853266.03881: variable 'controller_profile' from source: play vars 11762 1726853266.04081: variable 'controller_profile' from source: play vars 11762 1726853266.04084: variable 'port2_profile' from source: play vars 11762 1726853266.04086: variable 'port2_profile' from source: play vars 11762 1726853266.04277: variable 'dhcp_interface2' from source: play vars 11762 1726853266.04280: variable 'dhcp_interface2' from source: play vars 11762 1726853266.04283: variable 'controller_profile' from source: play vars 11762 1726853266.04412: variable 'controller_profile' from source: play vars 11762 1726853266.04464: Evaluated conditional (__network_wpa_supplicant_required): False 11762 1726853266.04776: when evaluation is False, skipping this task 11762 1726853266.04780: _execute() done 11762 1726853266.04782: dumping result to json 11762 1726853266.04785: done dumping result, returning 11762 1726853266.04788: done running TaskExecutor() for 
managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-d845-03d0-000000000284] 11762 1726853266.04791: sending task result for task 02083763-bbaf-d845-03d0-000000000284 11762 1726853266.04865: done sending task result for task 02083763-bbaf-d845-03d0-000000000284 11762 1726853266.04869: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11762 1726853266.04917: no more pending results, returning what we have 11762 1726853266.04920: results queue empty 11762 1726853266.04921: checking for any_errors_fatal 11762 1726853266.04944: done checking for any_errors_fatal 11762 1726853266.04945: checking for max_fail_percentage 11762 1726853266.04947: done checking for max_fail_percentage 11762 1726853266.04947: checking to see if all hosts have failed and the running result is not ok 11762 1726853266.04948: done checking to see if all hosts have failed 11762 1726853266.04949: getting the remaining hosts for this loop 11762 1726853266.04951: done getting the remaining hosts for this loop 11762 1726853266.04954: getting the next task for host managed_node2 11762 1726853266.04961: done getting next task for host managed_node2 11762 1726853266.04964: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11762 1726853266.04968: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853266.04983: getting variables 11762 1726853266.04984: in VariableManager get_vars() 11762 1726853266.05022: Calling all_inventory to load vars for managed_node2 11762 1726853266.05025: Calling groups_inventory to load vars for managed_node2 11762 1726853266.05027: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853266.05036: Calling all_plugins_play to load vars for managed_node2 11762 1726853266.05038: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853266.05041: Calling groups_plugins_play to load vars for managed_node2 11762 1726853266.06638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853266.08702: done with get_vars() 11762 1726853266.08734: done getting variables 11762 1726853266.08807: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:27:46 -0400 (0:00:00.125) 0:00:16.518 
****** 11762 1726853266.08842: entering _queue_task() for managed_node2/service 11762 1726853266.09257: worker is 1 (out of 1 available) 11762 1726853266.09274: exiting _queue_task() for managed_node2/service 11762 1726853266.09286: done queuing things up, now waiting for results queue to drain 11762 1726853266.09288: waiting for pending results... 11762 1726853266.09628: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11762 1726853266.09814: in run() - task 02083763-bbaf-d845-03d0-000000000285 11762 1726853266.09830: variable 'ansible_search_path' from source: unknown 11762 1726853266.09834: variable 'ansible_search_path' from source: unknown 11762 1726853266.09883: calling self._execute() 11762 1726853266.10041: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853266.10048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853266.10056: variable 'omit' from source: magic vars 11762 1726853266.10539: variable 'ansible_distribution_major_version' from source: facts 11762 1726853266.10545: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853266.10636: variable 'network_provider' from source: set_fact 11762 1726853266.10648: Evaluated conditional (network_provider == "initscripts"): False 11762 1726853266.10651: when evaluation is False, skipping this task 11762 1726853266.10654: _execute() done 11762 1726853266.10657: dumping result to json 11762 1726853266.10660: done dumping result, returning 11762 1726853266.10662: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-d845-03d0-000000000285] 11762 1726853266.10667: sending task result for task 02083763-bbaf-d845-03d0-000000000285 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
11762 1726853266.10862: no more pending results, returning what we have 11762 1726853266.10866: results queue empty 11762 1726853266.10867: checking for any_errors_fatal 11762 1726853266.10880: done checking for any_errors_fatal 11762 1726853266.10881: checking for max_fail_percentage 11762 1726853266.10884: done checking for max_fail_percentage 11762 1726853266.10884: checking to see if all hosts have failed and the running result is not ok 11762 1726853266.10885: done checking to see if all hosts have failed 11762 1726853266.10886: getting the remaining hosts for this loop 11762 1726853266.10888: done getting the remaining hosts for this loop 11762 1726853266.10891: getting the next task for host managed_node2 11762 1726853266.10899: done getting next task for host managed_node2 11762 1726853266.10903: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11762 1726853266.10908: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853266.10925: getting variables 11762 1726853266.10926: in VariableManager get_vars() 11762 1726853266.10965: Calling all_inventory to load vars for managed_node2 11762 1726853266.10968: Calling groups_inventory to load vars for managed_node2 11762 1726853266.10970: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853266.11096: Calling all_plugins_play to load vars for managed_node2 11762 1726853266.11100: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853266.11103: Calling groups_plugins_play to load vars for managed_node2 11762 1726853266.12022: done sending task result for task 02083763-bbaf-d845-03d0-000000000285 11762 1726853266.12026: WORKER PROCESS EXITING 11762 1726853266.12036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853266.13006: done with get_vars() 11762 1726853266.13023: done getting variables 11762 1726853266.13104: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:27:46 -0400 (0:00:00.042) 0:00:16.561 ****** 11762 1726853266.13140: entering _queue_task() for managed_node2/copy 11762 1726853266.13485: worker is 1 (out of 1 available) 11762 1726853266.13499: exiting _queue_task() for managed_node2/copy 11762 1726853266.13513: done queuing things up, now waiting for results queue to drain 11762 1726853266.13515: waiting for pending 
results... 11762 1726853266.13734: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11762 1726853266.13878: in run() - task 02083763-bbaf-d845-03d0-000000000286 11762 1726853266.13882: variable 'ansible_search_path' from source: unknown 11762 1726853266.13886: variable 'ansible_search_path' from source: unknown 11762 1726853266.13962: calling self._execute() 11762 1726853266.14078: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853266.14087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853266.14114: variable 'omit' from source: magic vars 11762 1726853266.14411: variable 'ansible_distribution_major_version' from source: facts 11762 1726853266.14421: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853266.14502: variable 'network_provider' from source: set_fact 11762 1726853266.14506: Evaluated conditional (network_provider == "initscripts"): False 11762 1726853266.14508: when evaluation is False, skipping this task 11762 1726853266.14511: _execute() done 11762 1726853266.14516: dumping result to json 11762 1726853266.14519: done dumping result, returning 11762 1726853266.14527: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-d845-03d0-000000000286] 11762 1726853266.14531: sending task result for task 02083763-bbaf-d845-03d0-000000000286 11762 1726853266.14622: done sending task result for task 02083763-bbaf-d845-03d0-000000000286 11762 1726853266.14624: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11762 1726853266.14695: no more pending results, returning what we have 11762 1726853266.14699: results queue empty 11762 1726853266.14700: 
checking for any_errors_fatal 11762 1726853266.14705: done checking for any_errors_fatal 11762 1726853266.14705: checking for max_fail_percentage 11762 1726853266.14707: done checking for max_fail_percentage 11762 1726853266.14707: checking to see if all hosts have failed and the running result is not ok 11762 1726853266.14708: done checking to see if all hosts have failed 11762 1726853266.14709: getting the remaining hosts for this loop 11762 1726853266.14710: done getting the remaining hosts for this loop 11762 1726853266.14713: getting the next task for host managed_node2 11762 1726853266.14720: done getting next task for host managed_node2 11762 1726853266.14723: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11762 1726853266.14728: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853266.14740: getting variables 11762 1726853266.14741: in VariableManager get_vars() 11762 1726853266.14770: Calling all_inventory to load vars for managed_node2 11762 1726853266.14774: Calling groups_inventory to load vars for managed_node2 11762 1726853266.14776: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853266.14784: Calling all_plugins_play to load vars for managed_node2 11762 1726853266.14786: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853266.14789: Calling groups_plugins_play to load vars for managed_node2 11762 1726853266.15504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853266.16540: done with get_vars() 11762 1726853266.16560: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:27:46 -0400 (0:00:00.035) 0:00:16.596 ****** 11762 1726853266.16645: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11762 1726853266.16647: Creating lock for fedora.linux_system_roles.network_connections 11762 1726853266.16952: worker is 1 (out of 1 available) 11762 1726853266.16963: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11762 1726853266.16978: done queuing things up, now waiting for results queue to drain 11762 1726853266.16980: waiting for pending results... 
11762 1726853266.17387: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11762 1726853266.17392: in run() - task 02083763-bbaf-d845-03d0-000000000287 11762 1726853266.17395: variable 'ansible_search_path' from source: unknown 11762 1726853266.17397: variable 'ansible_search_path' from source: unknown 11762 1726853266.17430: calling self._execute() 11762 1726853266.17525: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853266.17538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853266.17601: variable 'omit' from source: magic vars 11762 1726853266.17864: variable 'ansible_distribution_major_version' from source: facts 11762 1726853266.17874: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853266.17881: variable 'omit' from source: magic vars 11762 1726853266.17921: variable 'omit' from source: magic vars 11762 1726853266.18030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853266.19569: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853266.19628: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853266.19666: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853266.19699: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853266.19725: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853266.19805: variable 'network_provider' from source: set_fact 11762 1726853266.19925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853266.19953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853266.19979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853266.20014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853266.20042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853266.20116: variable 'omit' from source: magic vars 11762 1726853266.20237: variable 'omit' from source: magic vars 11762 1726853266.20313: variable 'network_connections' from source: include params 11762 1726853266.20331: variable 'controller_profile' from source: play vars 11762 1726853266.20386: variable 'controller_profile' from source: play vars 11762 1726853266.20579: variable 'controller_device' from source: play vars 11762 1726853266.20582: variable 'controller_device' from source: play vars 11762 1726853266.20584: variable 'port1_profile' from source: play vars 11762 1726853266.20586: variable 'port1_profile' from source: play vars 11762 1726853266.20588: variable 'dhcp_interface1' from source: play vars 11762 1726853266.20609: variable 'dhcp_interface1' from source: play vars 11762 1726853266.20620: variable 'controller_profile' from source: play vars 11762 1726853266.20684: variable 'controller_profile' from source: play vars 11762 
1726853266.20697: variable 'port2_profile' from source: play vars 11762 1726853266.20759: variable 'port2_profile' from source: play vars 11762 1726853266.20774: variable 'dhcp_interface2' from source: play vars 11762 1726853266.20835: variable 'dhcp_interface2' from source: play vars 11762 1726853266.20850: variable 'controller_profile' from source: play vars 11762 1726853266.20914: variable 'controller_profile' from source: play vars 11762 1726853266.21134: variable 'omit' from source: magic vars 11762 1726853266.21140: variable '__lsr_ansible_managed' from source: task vars 11762 1726853266.21186: variable '__lsr_ansible_managed' from source: task vars 11762 1726853266.21317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11762 1726853266.21693: Loaded config def from plugin (lookup/template) 11762 1726853266.21696: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11762 1726853266.21716: File lookup term: get_ansible_managed.j2 11762 1726853266.21719: variable 'ansible_search_path' from source: unknown 11762 1726853266.21724: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11762 1726853266.21736: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11762 1726853266.21751: variable 'ansible_search_path' from source: unknown 11762 1726853266.25504: variable 'ansible_managed' from source: unknown 11762 1726853266.25579: variable 'omit' from source: magic vars 11762 1726853266.25601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853266.25620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853266.25633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853266.25648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853266.25654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853266.25678: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853266.25682: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853266.25684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853266.25741: Set connection var ansible_timeout to 10 11762 1726853266.25747: Set connection var ansible_shell_type to sh 11762 1726853266.25749: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853266.25752: Set connection var ansible_shell_executable to /bin/sh 11762 1726853266.25762: Set connection 
var ansible_pipelining to False 11762 1726853266.25766: Set connection var ansible_connection to ssh 11762 1726853266.25786: variable 'ansible_shell_executable' from source: unknown 11762 1726853266.25789: variable 'ansible_connection' from source: unknown 11762 1726853266.25793: variable 'ansible_module_compression' from source: unknown 11762 1726853266.25799: variable 'ansible_shell_type' from source: unknown 11762 1726853266.25802: variable 'ansible_shell_executable' from source: unknown 11762 1726853266.25804: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853266.25806: variable 'ansible_pipelining' from source: unknown 11762 1726853266.25808: variable 'ansible_timeout' from source: unknown 11762 1726853266.25810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853266.25898: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853266.25907: variable 'omit' from source: magic vars 11762 1726853266.25914: starting attempt loop 11762 1726853266.25917: running the handler 11762 1726853266.25929: _low_level_execute_command(): starting 11762 1726853266.25935: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853266.26453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853266.26456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853266.26459: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853266.26461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853266.26516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853266.26520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853266.26524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853266.26602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853266.28329: stdout chunk (state=3): >>>/root <<< 11762 1726853266.28496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853266.28500: stdout chunk (state=3): >>><<< 11762 1726853266.28502: stderr chunk (state=3): >>><<< 11762 1726853266.28577: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853266.28581: _low_level_execute_command(): starting 11762 1726853266.28584: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418 `" && echo ansible-tmp-1726853266.285311-12486-133339945231418="` echo /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418 `" ) && sleep 0' 11762 1726853266.29032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853266.29040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853266.29078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853266.29081: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853266.29083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853266.29087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853266.29089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853266.29129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853266.29132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853266.29208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853266.31199: stdout chunk (state=3): >>>ansible-tmp-1726853266.285311-12486-133339945231418=/root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418 <<< 11762 1726853266.31331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853266.31348: stderr chunk (state=3): >>><<< 11762 1726853266.31357: stdout chunk (state=3): >>><<< 11762 1726853266.31429: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853266.285311-12486-133339945231418=/root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853266.31448: variable 'ansible_module_compression' from source: unknown 11762 1726853266.31502: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11762 1726853266.31509: ANSIBALLZ: Acquiring lock 11762 1726853266.31516: ANSIBALLZ: Lock acquired: 139956164010992 11762 1726853266.31523: ANSIBALLZ: Creating module 11762 1726853266.62551: ANSIBALLZ: Writing module into payload 11762 1726853266.62957: ANSIBALLZ: Writing module 11762 1726853266.62961: ANSIBALLZ: Renaming module 11762 1726853266.62964: ANSIBALLZ: Done creating module 11762 1726853266.63040: variable 'ansible_facts' from source: unknown 11762 1726853266.63378: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/AnsiballZ_network_connections.py 11762 1726853266.63383: Sending initial data 11762 1726853266.63385: Sent initial data (167 bytes) 11762 1726853266.64530: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853266.64607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853266.64628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853266.64644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853266.64713: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853266.64750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853266.64766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853266.64793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853266.64930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853266.66588: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853266.66692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853266.66858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpofanxmav /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/AnsiballZ_network_connections.py <<< 11762 1726853266.66862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/AnsiballZ_network_connections.py" <<< 11762 1726853266.66965: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpofanxmav" to remote "/root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/AnsiballZ_network_connections.py" <<< 11762 1726853266.68764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853266.68768: stderr chunk (state=3): >>><<< 11762 1726853266.68773: stdout chunk (state=3): >>><<< 11762 1726853266.68775: done transferring module to remote 11762 1726853266.68777: _low_level_execute_command(): starting 11762 1726853266.68779: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/ /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/AnsiballZ_network_connections.py && sleep 0' 11762 1726853266.69515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853266.69529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853266.69548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853266.69567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853266.69587: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853266.69684: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853266.69696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853266.69798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853266.71878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853266.71882: stdout chunk (state=3): >>><<< 11762 1726853266.71885: stderr chunk (state=3): >>><<< 11762 1726853266.71887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853266.71889: _low_level_execute_command(): starting 11762 1726853266.71891: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/AnsiballZ_network_connections.py && sleep 0' 11762 1726853266.72595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853266.72610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853266.72675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853266.72679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853266.72681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853266.72683: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853266.72803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853266.72810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853266.72867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853266.72926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853267.23126: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", 
"state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11762 1726853267.25412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853267.25437: stderr chunk (state=3): >>><<< 11762 1726853267.25440: stdout chunk (state=3): >>><<< 11762 1726853267.25459: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", 
"type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.9.197 closed. 11762 1726853267.25515: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': '802.3ad', 'ad_actor_sys_prio': 65535, 'ad_actor_system': '00:00:5e:00:53:5d', 'ad_select': 'stable', 'ad_user_port_key': 1023, 'all_ports_active': True, 'downdelay': 0, 'lacp_rate': 'slow', 'lp_interval': 128, 'miimon': 110, 'min_links': 0, 'num_grat_arp': 64, 'primary_reselect': 'better', 'resend_igmp': 225, 'updelay': 0, 'use_carrier': True, 'xmit_hash_policy': 'encap2+3'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853267.25522: _low_level_execute_command(): starting 11762 1726853267.25528: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853266.285311-12486-133339945231418/ > /dev/null 2>&1 && sleep 0' 11762 1726853267.25933: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853267.25967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853267.25970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.25975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853267.25977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.26023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853267.26030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853267.26032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853267.26101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853267.28276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853267.28280: stdout chunk (state=3): >>><<< 11762 1726853267.28282: stderr chunk (state=3): >>><<< 11762 1726853267.28284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853267.28286: handler run complete 11762 1726853267.28288: attempt loop complete, returning result 11762 1726853267.28290: _execute() done 11762 1726853267.28292: dumping result to json 11762 1726853267.28294: done dumping result, returning 11762 1726853267.28296: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-d845-03d0-000000000287] 11762 1726853267.28298: sending task result for task 02083763-bbaf-d845-03d0-000000000287 11762 1726853267.28383: done sending task result for task 02083763-bbaf-d845-03d0-000000000287 11762 1726853267.28386: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": 
true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active) 11762 1726853267.28533: no more pending results, returning what we have 11762 1726853267.28537: results queue empty 11762 1726853267.28537: checking for any_errors_fatal 11762 1726853267.28546: done checking for any_errors_fatal 11762 1726853267.28547: checking for max_fail_percentage 11762 1726853267.28548: done checking for max_fail_percentage 11762 1726853267.28549: checking to see if all hosts have failed and the running result is not ok 11762 1726853267.28550: done checking to see if all hosts have failed 11762 1726853267.28550: getting the remaining hosts for this loop 
11762 1726853267.28552: done getting the remaining hosts for this loop 11762 1726853267.28555: getting the next task for host managed_node2 11762 1726853267.28561: done getting next task for host managed_node2 11762 1726853267.28564: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11762 1726853267.28568: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853267.28686: getting variables 11762 1726853267.28688: in VariableManager get_vars() 11762 1726853267.28720: Calling all_inventory to load vars for managed_node2 11762 1726853267.28722: Calling groups_inventory to load vars for managed_node2 11762 1726853267.28725: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853267.28733: Calling all_plugins_play to load vars for managed_node2 11762 1726853267.28735: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853267.28738: Calling groups_plugins_play to load vars for managed_node2 11762 1726853267.31334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853267.33958: done with get_vars() 11762 1726853267.33991: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:27:47 -0400 (0:00:01.178) 0:00:17.775 ****** 11762 1726853267.34467: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11762 1726853267.34469: Creating lock for fedora.linux_system_roles.network_state 11762 1726853267.35000: worker is 1 (out of 1 available) 11762 1726853267.35013: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11762 1726853267.35027: done queuing things up, now waiting for results queue to drain 11762 1726853267.35028: waiting for pending results... 
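For reference, the `module_args` captured in the task result above imply role input along these lines. This is a sketch reconstructed from the logged invocation, not taken from the actual playbook; it assumes the standard `network_connections` variable of the fedora.linux_system_roles.network role, and the real play may set these values differently (e.g. via included vars):

```yaml
# Reconstructed from the logged module_args; the real playbook may differ.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: 802.3ad
      ad_actor_sys_prio: 65535
      ad_actor_system: "00:00:5e:00:53:5d"
      ad_select: stable
      ad_user_port_key: 1023
      all_ports_active: true
      downdelay: 0
      lacp_rate: slow
      lp_interval: 128
      miimon: 110
      min_links: 0
      num_grat_arp: 64
      primary_reselect: better
      resend_igmp: 225
      updelay: 0
      use_carrier: true
      xmit_hash_policy: encap2+3
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```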
11762 1726853267.35836: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11762 1726853267.35914: in run() - task 02083763-bbaf-d845-03d0-000000000288 11762 1726853267.35930: variable 'ansible_search_path' from source: unknown 11762 1726853267.35935: variable 'ansible_search_path' from source: unknown 11762 1726853267.35969: calling self._execute() 11762 1726853267.36280: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.36287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.36297: variable 'omit' from source: magic vars 11762 1726853267.37024: variable 'ansible_distribution_major_version' from source: facts 11762 1726853267.37072: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853267.37303: variable 'network_state' from source: role '' defaults 11762 1726853267.37313: Evaluated conditional (network_state != {}): False 11762 1726853267.37317: when evaluation is False, skipping this task 11762 1726853267.37319: _execute() done 11762 1726853267.37322: dumping result to json 11762 1726853267.37325: done dumping result, returning 11762 1726853267.37332: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-d845-03d0-000000000288] 11762 1726853267.37347: sending task result for task 02083763-bbaf-d845-03d0-000000000288 11762 1726853267.37548: done sending task result for task 02083763-bbaf-d845-03d0-000000000288 11762 1726853267.37551: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853267.37611: no more pending results, returning what we have 11762 1726853267.37615: results queue empty 11762 1726853267.37616: checking for any_errors_fatal 11762 1726853267.37635: done checking for any_errors_fatal 
11762 1726853267.37636: checking for max_fail_percentage 11762 1726853267.37639: done checking for max_fail_percentage 11762 1726853267.37639: checking to see if all hosts have failed and the running result is not ok 11762 1726853267.37640: done checking to see if all hosts have failed 11762 1726853267.37641: getting the remaining hosts for this loop 11762 1726853267.37646: done getting the remaining hosts for this loop 11762 1726853267.37650: getting the next task for host managed_node2 11762 1726853267.37658: done getting next task for host managed_node2 11762 1726853267.37662: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11762 1726853267.37669: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853267.37689: getting variables 11762 1726853267.37691: in VariableManager get_vars() 11762 1726853267.37728: Calling all_inventory to load vars for managed_node2 11762 1726853267.37731: Calling groups_inventory to load vars for managed_node2 11762 1726853267.37733: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853267.37746: Calling all_plugins_play to load vars for managed_node2 11762 1726853267.37749: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853267.37751: Calling groups_plugins_play to load vars for managed_node2 11762 1726853267.39909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853267.41929: done with get_vars() 11762 1726853267.41959: done getting variables 11762 1726853267.42025: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:27:47 -0400 (0:00:00.075) 0:00:17.851 ****** 11762 1726853267.42062: entering _queue_task() for managed_node2/debug 11762 1726853267.42582: worker is 1 (out of 1 available) 11762 1726853267.42593: exiting _queue_task() for managed_node2/debug 11762 1726853267.42603: done queuing things up, now waiting for results queue to drain 11762 1726853267.42604: waiting for pending results... 
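(Annotation: the "Show stderr messages" task queued here is a plain `debug` action over the registered module result; judging from the `stderr_lines` output printed further down, it amounts to something like the following sketch — not the verbatim task from tasks/main.yml:177:)

```yaml
# Assumed shape of the debug task: print the stderr lines captured from
# the earlier network_connections module run.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```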
11762 1726853267.42902: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11762 1726853267.42907: in run() - task 02083763-bbaf-d845-03d0-000000000289 11762 1726853267.42910: variable 'ansible_search_path' from source: unknown 11762 1726853267.42912: variable 'ansible_search_path' from source: unknown 11762 1726853267.42928: calling self._execute() 11762 1726853267.43030: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.43041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.43054: variable 'omit' from source: magic vars 11762 1726853267.43537: variable 'ansible_distribution_major_version' from source: facts 11762 1726853267.43557: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853267.43568: variable 'omit' from source: magic vars 11762 1726853267.43634: variable 'omit' from source: magic vars 11762 1726853267.43681: variable 'omit' from source: magic vars 11762 1726853267.43725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853267.43772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853267.43797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853267.43818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853267.43866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853267.43874: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853267.43882: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.43890: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11762 1726853267.43996: Set connection var ansible_timeout to 10 11762 1726853267.44004: Set connection var ansible_shell_type to sh 11762 1726853267.44014: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853267.44077: Set connection var ansible_shell_executable to /bin/sh 11762 1726853267.44084: Set connection var ansible_pipelining to False 11762 1726853267.44087: Set connection var ansible_connection to ssh 11762 1726853267.44089: variable 'ansible_shell_executable' from source: unknown 11762 1726853267.44090: variable 'ansible_connection' from source: unknown 11762 1726853267.44093: variable 'ansible_module_compression' from source: unknown 11762 1726853267.44095: variable 'ansible_shell_type' from source: unknown 11762 1726853267.44097: variable 'ansible_shell_executable' from source: unknown 11762 1726853267.44098: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.44100: variable 'ansible_pipelining' from source: unknown 11762 1726853267.44102: variable 'ansible_timeout' from source: unknown 11762 1726853267.44109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.44255: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853267.44273: variable 'omit' from source: magic vars 11762 1726853267.44283: starting attempt loop 11762 1726853267.44293: running the handler 11762 1726853267.44435: variable '__network_connections_result' from source: set_fact 11762 1726853267.44518: handler run complete 11762 1726853267.44624: attempt loop complete, returning result 11762 1726853267.44627: _execute() done 11762 1726853267.44630: dumping result to json 11762 1726853267.44632: 
done dumping result, returning 11762 1726853267.44634: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-d845-03d0-000000000289] 11762 1726853267.44636: sending task result for task 02083763-bbaf-d845-03d0-000000000289 11762 1726853267.44708: done sending task result for task 02083763-bbaf-d845-03d0-000000000289 11762 1726853267.44711: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active)" ] } 11762 1726853267.44795: no more pending results, returning what we have 11762 1726853267.44799: results queue empty 11762 1726853267.44800: checking for any_errors_fatal 11762 1726853267.44807: done checking for any_errors_fatal 11762 1726853267.44808: checking for max_fail_percentage 11762 1726853267.44810: done checking for max_fail_percentage 11762 1726853267.44811: checking to see if all hosts have failed and the running result is not ok 11762 1726853267.44811: done checking to see if all hosts have failed 11762 1726853267.44812: getting the remaining hosts for this loop 11762 1726853267.44814: done getting the remaining hosts for this loop 11762 1726853267.44817: getting the next task for host 
managed_node2 11762 1726853267.44824: done getting next task for host managed_node2 11762 1726853267.44827: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11762 1726853267.44832: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853267.44842: getting variables 11762 1726853267.44844: in VariableManager get_vars() 11762 1726853267.44880: Calling all_inventory to load vars for managed_node2 11762 1726853267.44883: Calling groups_inventory to load vars for managed_node2 11762 1726853267.44885: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853267.44895: Calling all_plugins_play to load vars for managed_node2 11762 1726853267.44902: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853267.44905: Calling groups_plugins_play to load vars for managed_node2 11762 1726853267.46909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853267.49512: done with get_vars() 11762 1726853267.49539: done getting variables 11762 1726853267.49714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:27:47 -0400 (0:00:00.076) 0:00:17.927 ****** 11762 1726853267.49748: entering _queue_task() for managed_node2/debug 11762 1726853267.50589: worker is 1 (out of 1 available) 11762 1726853267.50601: exiting _queue_task() for managed_node2/debug 11762 1726853267.50613: done queuing things up, now waiting for results queue to drain 11762 1726853267.50615: waiting for pending results... 
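(Annotation: the `module_args` dumped a little further down show the fully resolved connection profiles. As role input they would have looked roughly like this — a trimmed, hypothetical reconstruction that repeats only a subset of the bond options visible in the log:)

```yaml
# Values taken from the module_args printed in this log; remaining bond
# options (ad_actor_sys_prio, xmit_hash_policy, ...) omitted for brevity.
network_connections:
  - name: bond0
    type: bond
    interface_name: nm-bond
    state: up
    bond:
      mode: 802.3ad
      miimon: 110
      lacp_rate: slow
  - name: bond0.0
    type: ethernet
    controller: bond0
    interface_name: test1
    state: up
  - name: bond0.1
    type: ethernet
    controller: bond0
    interface_name: test2
    state: up
```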
11762 1726853267.51443: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11762 1726853267.51853: in run() - task 02083763-bbaf-d845-03d0-00000000028a 11762 1726853267.51875: variable 'ansible_search_path' from source: unknown 11762 1726853267.51879: variable 'ansible_search_path' from source: unknown 11762 1726853267.51927: calling self._execute() 11762 1726853267.52293: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.52298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.52305: variable 'omit' from source: magic vars 11762 1726853267.53698: variable 'ansible_distribution_major_version' from source: facts 11762 1726853267.53710: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853267.53717: variable 'omit' from source: magic vars 11762 1726853267.53785: variable 'omit' from source: magic vars 11762 1726853267.53820: variable 'omit' from source: magic vars 11762 1726853267.53860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853267.54251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853267.54294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853267.54324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853267.54340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853267.54386: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853267.54426: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.54435: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11762 1726853267.54640: Set connection var ansible_timeout to 10 11762 1726853267.54643: Set connection var ansible_shell_type to sh 11762 1726853267.54646: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853267.54648: Set connection var ansible_shell_executable to /bin/sh 11762 1726853267.54650: Set connection var ansible_pipelining to False 11762 1726853267.54652: Set connection var ansible_connection to ssh 11762 1726853267.54654: variable 'ansible_shell_executable' from source: unknown 11762 1726853267.54656: variable 'ansible_connection' from source: unknown 11762 1726853267.54658: variable 'ansible_module_compression' from source: unknown 11762 1726853267.54660: variable 'ansible_shell_type' from source: unknown 11762 1726853267.54662: variable 'ansible_shell_executable' from source: unknown 11762 1726853267.54663: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.54665: variable 'ansible_pipelining' from source: unknown 11762 1726853267.54667: variable 'ansible_timeout' from source: unknown 11762 1726853267.54669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.54815: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853267.54831: variable 'omit' from source: magic vars 11762 1726853267.54842: starting attempt loop 11762 1726853267.54853: running the handler 11762 1726853267.54911: variable '__network_connections_result' from source: set_fact 11762 1726853267.55005: variable '__network_connections_result' from source: set_fact 11762 1726853267.55232: handler run complete 11762 1726853267.55276: attempt loop complete, returning result 11762 1726853267.55376: 
_execute() done 11762 1726853267.55379: dumping result to json 11762 1726853267.55381: done dumping result, returning 11762 1726853267.55384: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-d845-03d0-00000000028a] 11762 1726853267.55386: sending task result for task 02083763-bbaf-d845-03d0-00000000028a 11762 1726853267.55469: done sending task result for task 02083763-bbaf-d845-03d0-00000000028a 11762 1726853267.55475: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c\n[010] #0, state:up persistent_state:present, 'bond0': 
up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active)" ] } } 11762 1726853267.55602: no more pending results, returning what we have 11762 1726853267.55607: results queue empty 11762 1726853267.55607: checking for any_errors_fatal 11762 1726853267.55613: done checking for any_errors_fatal 11762 1726853267.55614: checking for max_fail_percentage 11762 1726853267.55616: done checking for max_fail_percentage 11762 1726853267.55616: checking to see if all hosts have failed and the running result is not ok 11762 1726853267.55617: done checking to see if all hosts have failed 11762 1726853267.55618: getting the remaining hosts for this loop 11762 1726853267.55620: done getting the remaining hosts for this loop 11762 1726853267.55623: getting the next task for host managed_node2 11762 1726853267.55631: done getting next task for host managed_node2 11762 1726853267.55635: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages 
for the network_state 11762 1726853267.55640: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853267.55651: getting variables 11762 1726853267.55653: in VariableManager get_vars() 11762 1726853267.55891: Calling all_inventory to load vars for managed_node2 11762 1726853267.55894: Calling groups_inventory to load vars for managed_node2 11762 1726853267.55896: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853267.55905: Calling all_plugins_play to load vars for managed_node2 11762 1726853267.55907: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853267.55910: Calling groups_plugins_play to load vars for managed_node2 11762 1726853267.57990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853267.59594: done with get_vars() 11762 1726853267.59622: done getting variables 11762 1726853267.59684: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:27:47 -0400 (0:00:00.100) 0:00:18.027 ****** 11762 1726853267.59759: entering _queue_task() for managed_node2/debug 11762 1726853267.60375: worker is 1 (out of 1 available) 11762 1726853267.60383: exiting _queue_task() for managed_node2/debug 11762 1726853267.60394: done queuing things up, now waiting for results queue to drain 11762 1726853267.60396: waiting for pending results... 
11762 1726853267.60744: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11762 1726853267.60760: in run() - task 02083763-bbaf-d845-03d0-00000000028b 11762 1726853267.60764: variable 'ansible_search_path' from source: unknown 11762 1726853267.60767: variable 'ansible_search_path' from source: unknown 11762 1726853267.60770: calling self._execute() 11762 1726853267.60863: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.60878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.60893: variable 'omit' from source: magic vars 11762 1726853267.61337: variable 'ansible_distribution_major_version' from source: facts 11762 1726853267.61353: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853267.61578: variable 'network_state' from source: role '' defaults 11762 1726853267.61585: Evaluated conditional (network_state != {}): False 11762 1726853267.61587: when evaluation is False, skipping this task 11762 1726853267.61589: _execute() done 11762 1726853267.61591: dumping result to json 11762 1726853267.61594: done dumping result, returning 11762 1726853267.61596: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-d845-03d0-00000000028b] 11762 1726853267.61598: sending task result for task 02083763-bbaf-d845-03d0-00000000028b 11762 1726853267.61664: done sending task result for task 02083763-bbaf-d845-03d0-00000000028b 11762 1726853267.61668: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11762 1726853267.61732: no more pending results, returning what we have 11762 1726853267.61737: results queue empty 11762 1726853267.61738: checking for any_errors_fatal 11762 1726853267.61751: done checking for any_errors_fatal 11762 1726853267.61752: checking for 
max_fail_percentage 11762 1726853267.61754: done checking for max_fail_percentage 11762 1726853267.61755: checking to see if all hosts have failed and the running result is not ok 11762 1726853267.61756: done checking to see if all hosts have failed 11762 1726853267.61756: getting the remaining hosts for this loop 11762 1726853267.61758: done getting the remaining hosts for this loop 11762 1726853267.61762: getting the next task for host managed_node2 11762 1726853267.61772: done getting next task for host managed_node2 11762 1726853267.61777: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11762 1726853267.61783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853267.61798: getting variables 11762 1726853267.61799: in VariableManager get_vars() 11762 1726853267.61839: Calling all_inventory to load vars for managed_node2 11762 1726853267.61842: Calling groups_inventory to load vars for managed_node2 11762 1726853267.61845: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853267.61857: Calling all_plugins_play to load vars for managed_node2 11762 1726853267.61861: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853267.61864: Calling groups_plugins_play to load vars for managed_node2 11762 1726853267.62862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853267.63736: done with get_vars() 11762 1726853267.63766: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:27:47 -0400 (0:00:00.041) 0:00:18.069 ****** 11762 1726853267.63869: entering _queue_task() for managed_node2/ping 11762 1726853267.63874: Creating lock for ping 11762 1726853267.64221: worker is 1 (out of 1 available) 11762 1726853267.64233: exiting _queue_task() for managed_node2/ping 11762 1726853267.64248: done queuing things up, now waiting for results queue to drain 11762 1726853267.64249: waiting for pending results... 
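(Annotation: the "Re-test connectivity" task queued here loads the `ping` action; its first step, visible in the SSH stderr chunks below, is a `_low_level_execute_command()` probe (`echo ~ && sleep 0`) that discovers the remote user's home directory over the multiplexed ControlMaster connection. The task itself is presumably just — a sketch of tasks/main.yml:192, not confirmed by this log:)

```yaml
# Assumed task body: a bare ping to verify the managed node is still
# reachable after the network changes were applied.
- name: Re-test connectivity
  ping:
```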
11762 1726853267.64598: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11762 1726853267.64683: in run() - task 02083763-bbaf-d845-03d0-00000000028c 11762 1726853267.64756: variable 'ansible_search_path' from source: unknown 11762 1726853267.64760: variable 'ansible_search_path' from source: unknown 11762 1726853267.64764: calling self._execute() 11762 1726853267.64850: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.64854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.64857: variable 'omit' from source: magic vars 11762 1726853267.65192: variable 'ansible_distribution_major_version' from source: facts 11762 1726853267.65201: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853267.65207: variable 'omit' from source: magic vars 11762 1726853267.65251: variable 'omit' from source: magic vars 11762 1726853267.65279: variable 'omit' from source: magic vars 11762 1726853267.65307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853267.65335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853267.65353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853267.65369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853267.65380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853267.65403: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853267.65406: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.65409: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11762 1726853267.65479: Set connection var ansible_timeout to 10 11762 1726853267.65483: Set connection var ansible_shell_type to sh 11762 1726853267.65487: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853267.65497: Set connection var ansible_shell_executable to /bin/sh 11762 1726853267.65499: Set connection var ansible_pipelining to False 11762 1726853267.65506: Set connection var ansible_connection to ssh 11762 1726853267.65523: variable 'ansible_shell_executable' from source: unknown 11762 1726853267.65526: variable 'ansible_connection' from source: unknown 11762 1726853267.65529: variable 'ansible_module_compression' from source: unknown 11762 1726853267.65531: variable 'ansible_shell_type' from source: unknown 11762 1726853267.65533: variable 'ansible_shell_executable' from source: unknown 11762 1726853267.65535: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853267.65538: variable 'ansible_pipelining' from source: unknown 11762 1726853267.65541: variable 'ansible_timeout' from source: unknown 11762 1726853267.65548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853267.65707: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853267.65717: variable 'omit' from source: magic vars 11762 1726853267.65720: starting attempt loop 11762 1726853267.65722: running the handler 11762 1726853267.65734: _low_level_execute_command(): starting 11762 1726853267.65741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853267.66263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 
1726853267.66266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.66274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853267.66278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.66314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853267.66317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853267.66323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853267.66397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853267.68131: stdout chunk (state=3): >>>/root <<< 11762 1726853267.68357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853267.68390: stderr chunk (state=3): >>><<< 11762 1726853267.68392: stdout chunk (state=3): >>><<< 11762 1726853267.68404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853267.68454: _low_level_execute_command(): starting 11762 1726853267.68459: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193 `" && echo ansible-tmp-1726853267.6840868-12563-272860871258193="` echo /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193 `" ) && sleep 0' 11762 1726853267.68842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853267.68848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.68857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853267.68860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853267.68862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.68910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853267.68913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853267.68981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853267.70958: stdout chunk (state=3): >>>ansible-tmp-1726853267.6840868-12563-272860871258193=/root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193 <<< 11762 1726853267.71103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853267.71126: stderr chunk (state=3): >>><<< 11762 1726853267.71129: stdout chunk (state=3): >>><<< 11762 1726853267.71145: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853267.6840868-12563-272860871258193=/root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853267.71188: variable 'ansible_module_compression' from source: unknown 11762 1726853267.71220: ANSIBALLZ: Using lock for ping 11762 1726853267.71223: ANSIBALLZ: Acquiring lock 11762 1726853267.71226: ANSIBALLZ: Lock acquired: 139956162150912 11762 1726853267.71228: ANSIBALLZ: Creating module 11762 1726853267.79468: ANSIBALLZ: Writing module into payload 11762 1726853267.79508: ANSIBALLZ: Writing module 11762 1726853267.79528: ANSIBALLZ: Renaming module 11762 1726853267.79532: ANSIBALLZ: Done creating module 11762 1726853267.79551: variable 'ansible_facts' from source: unknown 11762 1726853267.79596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/AnsiballZ_ping.py 11762 1726853267.79698: Sending initial data 11762 1726853267.79701: Sent initial data (153 bytes) 11762 1726853267.80156: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853267.80159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853267.80162: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853267.80164: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853267.80166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853267.80170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.80221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853267.80225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853267.80229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853267.80307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853267.82073: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 
1726853267.82142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853267.82217: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpxk47ccqf /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/AnsiballZ_ping.py <<< 11762 1726853267.82221: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/AnsiballZ_ping.py" <<< 11762 1726853267.82301: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpxk47ccqf" to remote "/root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/AnsiballZ_ping.py" <<< 11762 1726853267.83256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853267.83260: stdout chunk (state=3): >>><<< 11762 1726853267.83262: stderr chunk (state=3): >>><<< 11762 1726853267.83264: done transferring module to remote 11762 1726853267.83266: _low_level_execute_command(): starting 11762 1726853267.83268: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/ /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/AnsiballZ_ping.py && sleep 0' 11762 1726853267.83807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853267.83828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853267.83845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853267.83865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853267.83891: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853267.83946: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.84006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853267.84045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853267.84134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853267.86065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853267.86077: stdout chunk (state=3): >>><<< 11762 1726853267.86088: stderr chunk (state=3): >>><<< 11762 1726853267.86164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853267.86167: _low_level_execute_command(): starting 11762 1726853267.86170: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/AnsiballZ_ping.py && sleep 0' 11762 1726853267.86748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853267.86762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853267.86776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853267.86793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853267.86808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853267.86833: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.86885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853267.86938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853267.86957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853267.86978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853267.87083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.02569: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11762 1726853268.04088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853268.04092: stdout chunk (state=3): >>><<< 11762 1726853268.04095: stderr chunk (state=3): >>><<< 11762 1726853268.04112: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853268.04142: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853268.04242: _low_level_execute_command(): starting 11762 1726853268.04251: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853267.6840868-12563-272860871258193/ > /dev/null 2>&1 && sleep 0' 11762 1726853268.05379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853268.05392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853268.05405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853268.05421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853268.05437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853268.05591: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853268.05690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853268.05858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.07832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853268.07849: stdout chunk (state=3): >>><<< 11762 1726853268.07868: stderr chunk (state=3): >>><<< 11762 1726853268.07894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853268.07906: handler run complete 11762 1726853268.07927: attempt loop complete, returning result 11762 1726853268.07934: _execute() done 11762 1726853268.07950: dumping result to json 11762 1726853268.08082: done dumping result, returning 11762 1726853268.08085: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-d845-03d0-00000000028c] 11762 1726853268.08088: sending task result for task 02083763-bbaf-d845-03d0-00000000028c 11762 1726853268.08162: done sending task result for task 02083763-bbaf-d845-03d0-00000000028c 11762 1726853268.08166: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11762 1726853268.08338: no more pending results, returning what we have 11762 1726853268.08346: results queue empty 11762 1726853268.08347: checking for any_errors_fatal 11762 1726853268.08354: done checking for any_errors_fatal 11762 1726853268.08354: checking for max_fail_percentage 11762 1726853268.08357: done checking for max_fail_percentage 11762 1726853268.08358: checking to see if all hosts have failed and the running result is not ok 11762 1726853268.08358: done checking to see if all hosts have failed 11762 1726853268.08359: getting the remaining hosts for this loop 11762 1726853268.08361: done getting the remaining hosts for this loop 11762 1726853268.08365: getting the next task for host managed_node2 11762 1726853268.08583: done getting next task for host managed_node2 11762 1726853268.08586: ^ task is: TASK: meta (role_complete) 11762 1726853268.08592: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853268.08604: getting variables 11762 1726853268.08606: in VariableManager get_vars() 11762 1726853268.08652: Calling all_inventory to load vars for managed_node2 11762 1726853268.08655: Calling groups_inventory to load vars for managed_node2 11762 1726853268.08658: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.08669: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.08875: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.08881: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.11387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.13839: done with get_vars() 11762 1726853268.13869: done getting variables 11762 1726853268.14045: done queuing things up, now waiting for results queue to drain 11762 1726853268.14048: results queue empty 11762 1726853268.14049: checking for any_errors_fatal 11762 1726853268.14057: done checking for any_errors_fatal 11762 1726853268.14058: checking for max_fail_percentage 11762 1726853268.14060: done checking for max_fail_percentage 11762 1726853268.14060: checking to see if all hosts have failed and the running result is not ok 11762 1726853268.14061: done checking to see if all hosts have failed 11762 1726853268.14062: getting the remaining hosts for this loop 11762 1726853268.14063: done getting the remaining hosts for this loop 11762 1726853268.14066: getting the next task for host managed_node2 11762 1726853268.14074: done getting next task for host managed_node2 11762 1726853268.14076: ^ task is: TASK: Show result 11762 1726853268.14079: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853268.14082: getting variables 11762 1726853268.14083: in VariableManager get_vars() 11762 1726853268.14093: Calling all_inventory to load vars for managed_node2 11762 1726853268.14095: Calling groups_inventory to load vars for managed_node2 11762 1726853268.14098: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.14103: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.14106: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.14108: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.16159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.17766: done with get_vars() 11762 1726853268.17858: done getting variables 11762 1726853268.17913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:46 Friday 20 September 2024 13:27:48 -0400 (0:00:00.540) 0:00:18.609 ****** 11762 1726853268.17953: entering _queue_task() for managed_node2/debug 11762 1726853268.18398: worker is 1 (out of 1 available) 11762 1726853268.18410: exiting _queue_task() for managed_node2/debug 11762 1726853268.18423: done queuing things up, now waiting for results queue to drain 11762 1726853268.18424: waiting for pending results... 11762 1726853268.18899: running TaskExecutor() for managed_node2/TASK: Show result 11762 1726853268.18903: in run() - task 02083763-bbaf-d845-03d0-0000000001c6 11762 1726853268.18907: variable 'ansible_search_path' from source: unknown 11762 1726853268.18910: variable 'ansible_search_path' from source: unknown 11762 1726853268.18912: calling self._execute() 11762 1726853268.19006: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.19023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.19040: variable 'omit' from source: magic vars 11762 1726853268.19547: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.19563: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.19576: variable 'omit' from source: magic vars 11762 1726853268.19630: variable 'omit' from source: magic vars 11762 1726853268.19689: variable 'omit' from source: magic vars 11762 1726853268.19732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853268.19864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853268.19867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853268.19870: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853268.19874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853268.19879: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853268.19887: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.19894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.20005: Set connection var ansible_timeout to 10 11762 1726853268.20013: Set connection var ansible_shell_type to sh 11762 1726853268.20022: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853268.20030: Set connection var ansible_shell_executable to /bin/sh 11762 1726853268.20041: Set connection var ansible_pipelining to False 11762 1726853268.20055: Set connection var ansible_connection to ssh 11762 1726853268.20088: variable 'ansible_shell_executable' from source: unknown 11762 1726853268.20096: variable 'ansible_connection' from source: unknown 11762 1726853268.20103: variable 'ansible_module_compression' from source: unknown 11762 1726853268.20109: variable 'ansible_shell_type' from source: unknown 11762 1726853268.20114: variable 'ansible_shell_executable' from source: unknown 11762 1726853268.20120: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.20127: variable 'ansible_pipelining' from source: unknown 11762 1726853268.20134: variable 'ansible_timeout' from source: unknown 11762 1726853268.20188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.20299: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853268.20316: variable 'omit' from source: magic vars 11762 1726853268.20324: starting attempt loop 11762 1726853268.20331: running the handler 11762 1726853268.20384: variable '__network_connections_result' from source: set_fact 11762 1726853268.20475: variable '__network_connections_result' from source: set_fact 11762 1726853268.20702: handler run complete 11762 1726853268.20841: attempt loop complete, returning result 11762 1726853268.20847: _execute() done 11762 1726853268.20849: dumping result to json 11762 1726853268.20851: done dumping result, returning 11762 1726853268.20854: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-d845-03d0-0000000001c6] 11762 1726853268.20856: sending task result for task 02083763-bbaf-d845-03d0-0000000001c6 11762 1726853268.20936: done sending task result for task 02083763-bbaf-d845-03d0-0000000001c6 11762 1726853268.20939: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": 
"bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 131ea31c-3b61-4971-a483-9ea88268ee14 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 09dac36d-59ab-47a3-bd65-311b47a40724 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1e65cb07-7038-41b3-8603-a8db12da667c (not-active)" ] } } 11762 1726853268.21287: no more pending results, returning what we have 11762 1726853268.21291: results queue empty 11762 1726853268.21292: checking for any_errors_fatal 11762 1726853268.21294: done checking for any_errors_fatal 11762 1726853268.21295: checking for max_fail_percentage 11762 1726853268.21297: done 
checking for max_fail_percentage 11762 1726853268.21298: checking to see if all hosts have failed and the running result is not ok 11762 1726853268.21298: done checking to see if all hosts have failed 11762 1726853268.21299: getting the remaining hosts for this loop 11762 1726853268.21301: done getting the remaining hosts for this loop 11762 1726853268.21304: getting the next task for host managed_node2 11762 1726853268.21313: done getting next task for host managed_node2 11762 1726853268.21317: ^ task is: TASK: Asserts 11762 1726853268.21319: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853268.21325: getting variables 11762 1726853268.21327: in VariableManager get_vars() 11762 1726853268.21361: Calling all_inventory to load vars for managed_node2 11762 1726853268.21364: Calling groups_inventory to load vars for managed_node2 11762 1726853268.21368: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.21402: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.21406: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.21409: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.23047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.24590: done with get_vars() 11762 1726853268.24613: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:27:48 -0400 (0:00:00.067) 0:00:18.677 ****** 11762 1726853268.24716: entering _queue_task() for managed_node2/include_tasks 11762 1726853268.25163: worker is 1 (out of 1 available) 11762 1726853268.25178: exiting _queue_task() for managed_node2/include_tasks 11762 1726853268.25198: done queuing things up, now waiting for results queue to drain 11762 1726853268.25199: waiting for pending results... 
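For readers tracing the log: the `module_args` dumped by the "Show result" task above would be produced by an invocation of the `fedora.linux_system_roles.network` role roughly like the following. This is a reconstruction from the debug output, not the actual test playbook; option values are copied from the log, and the play layout is a guess.

```yaml
# Sketch reconstructed from module_args above (not the real test file):
# one bond controller profile plus its two ethernet port profiles.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0
            type: bond
            interface_name: nm-bond
            state: up
            bond:
              mode: 802.3ad
              ad_actor_sys_prio: 65535
              ad_actor_system: "00:00:5e:00:53:5d"
              miimon: 110
              xmit_hash_policy: encap2+3
              # ... remaining bond options as shown in the debug output
            ip:
              route_metric4: 65535
          - name: bond0.0
            type: ethernet
            interface_name: test1
            controller: bond0
            state: up
          - name: bond0.1
            type: ethernet
            interface_name: test2
            controller: bond0
            state: up
```

The `(is-modified)` / `(not-active)` markers in the stderr lines above are the role's per-connection activation notes, which is why the task reports `"changed": true`.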
11762 1726853268.25539: running TaskExecutor() for managed_node2/TASK: Asserts 11762 1726853268.25679: in run() - task 02083763-bbaf-d845-03d0-00000000008d 11762 1726853268.25713: variable 'ansible_search_path' from source: unknown 11762 1726853268.25733: variable 'ansible_search_path' from source: unknown 11762 1726853268.25783: variable 'lsr_assert' from source: include params 11762 1726853268.26040: variable 'lsr_assert' from source: include params 11762 1726853268.26161: variable 'omit' from source: magic vars 11762 1726853268.26341: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.26362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.26385: variable 'omit' from source: magic vars 11762 1726853268.26646: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.26660: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.26672: variable 'item' from source: unknown 11762 1726853268.26812: variable 'item' from source: unknown 11762 1726853268.26815: variable 'item' from source: unknown 11762 1726853268.26854: variable 'item' from source: unknown 11762 1726853268.27180: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.27183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.27185: variable 'omit' from source: magic vars 11762 1726853268.27226: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.27277: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.27286: variable 'item' from source: unknown 11762 1726853268.27327: variable 'item' from source: unknown 11762 1726853268.27377: variable 'item' from source: unknown 11762 1726853268.27508: variable 'item' from source: unknown 11762 1726853268.27617: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 
1726853268.27621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.27623: variable 'omit' from source: magic vars 11762 1726853268.27780: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.27792: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.27801: variable 'item' from source: unknown 11762 1726853268.27876: variable 'item' from source: unknown 11762 1726853268.27946: variable 'item' from source: unknown 11762 1726853268.27982: variable 'item' from source: unknown 11762 1726853268.28275: dumping result to json 11762 1726853268.28279: done dumping result, returning 11762 1726853268.28284: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-d845-03d0-00000000008d] 11762 1726853268.28290: sending task result for task 02083763-bbaf-d845-03d0-00000000008d 11762 1726853268.28330: done sending task result for task 02083763-bbaf-d845-03d0-00000000008d 11762 1726853268.28333: WORKER PROCESS EXITING 11762 1726853268.28367: no more pending results, returning what we have 11762 1726853268.28375: in VariableManager get_vars() 11762 1726853268.28413: Calling all_inventory to load vars for managed_node2 11762 1726853268.28416: Calling groups_inventory to load vars for managed_node2 11762 1726853268.28419: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.28434: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.28438: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.28449: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.30002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.30891: done with get_vars() 11762 1726853268.30907: variable 'ansible_search_path' from source: unknown 11762 1726853268.30907: variable 'ansible_search_path' from source: unknown 11762 
1726853268.30937: variable 'ansible_search_path' from source: unknown 11762 1726853268.30938: variable 'ansible_search_path' from source: unknown 11762 1726853268.30957: variable 'ansible_search_path' from source: unknown 11762 1726853268.30958: variable 'ansible_search_path' from source: unknown 11762 1726853268.30978: we have included files to process 11762 1726853268.30979: generating all_blocks data 11762 1726853268.30980: done generating all_blocks data 11762 1726853268.30984: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11762 1726853268.30984: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11762 1726853268.30986: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11762 1726853268.31099: in VariableManager get_vars() 11762 1726853268.31113: done with get_vars() 11762 1726853268.31117: variable 'item' from source: include params 11762 1726853268.31195: variable 'item' from source: include params 11762 1726853268.31218: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11762 1726853268.31270: in VariableManager get_vars() 11762 1726853268.31285: done with get_vars() 11762 1726853268.31369: done processing included file 11762 1726853268.31372: iterating over new_blocks loaded from include file 11762 1726853268.31374: in VariableManager get_vars() 11762 1726853268.31382: done with get_vars() 11762 1726853268.31383: filtering new block on tags 11762 1726853268.31412: done filtering new block on tags 11762 
1726853268.31414: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml for managed_node2 => (item=tasks/assert_controller_device_present.yml) 11762 1726853268.31418: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11762 1726853268.31419: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11762 1726853268.31422: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11762 1726853268.31504: in VariableManager get_vars() 11762 1726853268.31517: done with get_vars() 11762 1726853268.31524: done processing included file 11762 1726853268.31525: iterating over new_blocks loaded from include file 11762 1726853268.31526: in VariableManager get_vars() 11762 1726853268.31535: done with get_vars() 11762 1726853268.31536: filtering new block on tags 11762 1726853268.31551: done filtering new block on tags 11762 1726853268.31552: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml for managed_node2 => (item=tasks/assert_bond_port_profile_present.yml) 11762 1726853268.31555: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11762 1726853268.31555: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11762 1726853268.31561: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11762 1726853268.31763: in VariableManager get_vars() 11762 1726853268.31787: done with get_vars() 11762 1726853268.31818: in VariableManager get_vars() 11762 1726853268.31830: done with get_vars() 11762 1726853268.31844: done processing included file 11762 1726853268.31846: iterating over new_blocks loaded from include file 11762 1726853268.31847: in VariableManager get_vars() 11762 1726853268.31865: done with get_vars() 11762 1726853268.31867: filtering new block on tags 11762 1726853268.31906: done filtering new block on tags 11762 1726853268.31908: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node2 => (item=tasks/assert_bond_options.yml) 11762 1726853268.31911: extending task lists for all hosts with included blocks 11762 1726853268.33212: done extending task lists 11762 1726853268.33213: done processing included files 11762 1726853268.33214: results queue empty 11762 1726853268.33214: checking for any_errors_fatal 11762 1726853268.33218: done checking for any_errors_fatal 11762 1726853268.33219: checking for max_fail_percentage 11762 1726853268.33220: done checking for max_fail_percentage 11762 1726853268.33220: checking to see if all hosts have failed and the running result is not ok 11762 1726853268.33221: done checking to see if all hosts have failed 11762 1726853268.33221: getting the remaining hosts for this loop 11762 1726853268.33222: done getting the remaining hosts for this loop 11762 1726853268.33223: getting the next task for host managed_node2 11762 1726853268.33227: done getting next task for host managed_node2 11762 1726853268.33228: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11762 1726853268.33230: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853268.33232: getting variables 11762 1726853268.33232: in VariableManager get_vars() 11762 1726853268.33238: Calling all_inventory to load vars for managed_node2 11762 1726853268.33240: Calling groups_inventory to load vars for managed_node2 11762 1726853268.33242: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.33248: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.33250: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.33251: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.33883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.34721: done with get_vars() 11762 1726853268.34737: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:48 -0400 (0:00:00.100) 0:00:18.778 ****** 11762 1726853268.34798: entering _queue_task() for managed_node2/include_tasks 11762 1726853268.35098: worker is 1 (out of 1 available) 11762 1726853268.35112: exiting _queue_task() for managed_node2/include_tasks 11762 1726853268.35124: done queuing things up, now waiting for results queue to drain 11762 1726853268.35126: waiting for pending results... 
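The two helper files referenced above are not shown in the log, but from the task names, the task paths (`assert_device_present.yml:3`, `get_interface_stat.yml:3`), and the `stat` action being queued, they plausibly look like this. The task bodies below are hypothetical reconstructions; only the file names, task names, and the `interface` variable come from the log.

```yaml
# tasks/get_interface_stat.yml (hypothetical reconstruction)
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat

# tasks/assert_device_present.yml (hypothetical reconstruction)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present
  assert:
    that:
      - interface_stat.stat.exists
```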
11762 1726853268.35501: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11762 1726853268.35585: in run() - task 02083763-bbaf-d845-03d0-0000000003f5 11762 1726853268.35610: variable 'ansible_search_path' from source: unknown 11762 1726853268.35618: variable 'ansible_search_path' from source: unknown 11762 1726853268.35704: calling self._execute() 11762 1726853268.35761: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.35777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.35793: variable 'omit' from source: magic vars 11762 1726853268.36191: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.36209: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.36249: _execute() done 11762 1726853268.36253: dumping result to json 11762 1726853268.36255: done dumping result, returning 11762 1726853268.36258: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-d845-03d0-0000000003f5] 11762 1726853268.36260: sending task result for task 02083763-bbaf-d845-03d0-0000000003f5 11762 1726853268.36445: no more pending results, returning what we have 11762 1726853268.36453: in VariableManager get_vars() 11762 1726853268.36509: Calling all_inventory to load vars for managed_node2 11762 1726853268.36512: Calling groups_inventory to load vars for managed_node2 11762 1726853268.36515: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.36548: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.36552: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.36556: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.37086: done sending task result for task 02083763-bbaf-d845-03d0-0000000003f5 11762 1726853268.37089: WORKER PROCESS EXITING 11762 
1726853268.37445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.42149: done with get_vars() 11762 1726853268.42173: variable 'ansible_search_path' from source: unknown 11762 1726853268.42174: variable 'ansible_search_path' from source: unknown 11762 1726853268.42214: we have included files to process 11762 1726853268.42215: generating all_blocks data 11762 1726853268.42217: done generating all_blocks data 11762 1726853268.42218: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853268.42219: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853268.42221: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853268.42384: done processing included file 11762 1726853268.42386: iterating over new_blocks loaded from include file 11762 1726853268.42388: in VariableManager get_vars() 11762 1726853268.42404: done with get_vars() 11762 1726853268.42406: filtering new block on tags 11762 1726853268.42433: done filtering new block on tags 11762 1726853268.42436: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11762 1726853268.42440: extending task lists for all hosts with included blocks 11762 1726853268.42637: done extending task lists 11762 1726853268.42638: done processing included files 11762 1726853268.42639: results queue empty 11762 1726853268.42640: checking for any_errors_fatal 11762 1726853268.42645: done checking for any_errors_fatal 11762 1726853268.42645: checking for max_fail_percentage 11762 1726853268.42646: done checking for 
max_fail_percentage 11762 1726853268.42647: checking to see if all hosts have failed and the running result is not ok 11762 1726853268.42648: done checking to see if all hosts have failed 11762 1726853268.42649: getting the remaining hosts for this loop 11762 1726853268.42650: done getting the remaining hosts for this loop 11762 1726853268.42652: getting the next task for host managed_node2 11762 1726853268.42656: done getting next task for host managed_node2 11762 1726853268.42657: ^ task is: TASK: Get stat for interface {{ interface }} 11762 1726853268.42660: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853268.42662: getting variables 11762 1726853268.42663: in VariableManager get_vars() 11762 1726853268.42673: Calling all_inventory to load vars for managed_node2 11762 1726853268.42675: Calling groups_inventory to load vars for managed_node2 11762 1726853268.42678: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.42683: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.42685: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.42688: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.43562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.44403: done with get_vars() 11762 1726853268.44419: done getting variables 11762 1726853268.44519: variable 'interface' from source: task vars 11762 1726853268.44522: variable 'controller_device' from source: play vars 11762 1726853268.44564: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:48 -0400 (0:00:00.097) 0:00:18.876 ****** 11762 1726853268.44587: entering _queue_task() for managed_node2/stat 11762 1726853268.44851: worker is 1 (out of 1 available) 11762 1726853268.44865: exiting _queue_task() for managed_node2/stat 11762 1726853268.44881: done queuing things up, now waiting for results queue to drain 11762 1726853268.44884: waiting for pending results... 
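The "Get stat for interface nm-bond" task queued above boils down to checking for the device's sysfs node. The same presence check can be made by hand from a shell prompt (a sketch; `nm-bond` is the device name from this run, and `lo` is used below only because it exists on any Linux host):

```shell
# Check whether a network device exists by looking for its node under
# /sys/class/net, which is what the stat-based assertion amounts to.
dev="lo"   # substitute e.g. nm-bond to mirror this test run
if [ -e "/sys/class/net/$dev" ]; then
    echo "device $dev present"
else
    echo "device $dev absent"
fi
```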
11762 1726853268.45063: running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond 11762 1726853268.45161: in run() - task 02083763-bbaf-d845-03d0-0000000004af 11762 1726853268.45175: variable 'ansible_search_path' from source: unknown 11762 1726853268.45179: variable 'ansible_search_path' from source: unknown 11762 1726853268.45206: calling self._execute() 11762 1726853268.45274: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.45280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.45288: variable 'omit' from source: magic vars 11762 1726853268.45564: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.45575: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.45581: variable 'omit' from source: magic vars 11762 1726853268.45624: variable 'omit' from source: magic vars 11762 1726853268.45693: variable 'interface' from source: task vars 11762 1726853268.45696: variable 'controller_device' from source: play vars 11762 1726853268.45740: variable 'controller_device' from source: play vars 11762 1726853268.45760: variable 'omit' from source: magic vars 11762 1726853268.45792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853268.45818: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853268.45835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853268.45849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853268.45859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853268.45885: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11762 1726853268.45889: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.45892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.45956: Set connection var ansible_timeout to 10 11762 1726853268.45959: Set connection var ansible_shell_type to sh 11762 1726853268.45963: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853268.45968: Set connection var ansible_shell_executable to /bin/sh 11762 1726853268.45978: Set connection var ansible_pipelining to False 11762 1726853268.45984: Set connection var ansible_connection to ssh 11762 1726853268.46002: variable 'ansible_shell_executable' from source: unknown 11762 1726853268.46005: variable 'ansible_connection' from source: unknown 11762 1726853268.46008: variable 'ansible_module_compression' from source: unknown 11762 1726853268.46011: variable 'ansible_shell_type' from source: unknown 11762 1726853268.46014: variable 'ansible_shell_executable' from source: unknown 11762 1726853268.46016: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.46018: variable 'ansible_pipelining' from source: unknown 11762 1726853268.46021: variable 'ansible_timeout' from source: unknown 11762 1726853268.46024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.46170: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853268.46181: variable 'omit' from source: magic vars 11762 1726853268.46185: starting attempt loop 11762 1726853268.46189: running the handler 11762 1726853268.46204: _low_level_execute_command(): starting 11762 1726853268.46209: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853268.46718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853268.46722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.46725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853268.46728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853268.46730: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.46781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853268.46785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853268.46960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.48690: stdout chunk (state=3): >>>/root <<< 11762 1726853268.48820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853268.48857: stderr chunk (state=3): >>><<< 11762 1726853268.48859: stdout chunk (state=3): >>><<< 11762 1726853268.48878: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853268.48899: _low_level_execute_command(): starting 11762 1726853268.48905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167 `" && echo ansible-tmp-1726853268.4888384-12602-45821359846167="` echo /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167 `" ) && sleep 0' 11762 1726853268.49356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853268.49360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.49374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853268.49377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853268.49379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.49417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853268.49422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853268.49426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853268.49502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.51553: stdout chunk (state=3): >>>ansible-tmp-1726853268.4888384-12602-45821359846167=/root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167 <<< 11762 1726853268.51668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853268.51698: stderr chunk (state=3): >>><<< 11762 1726853268.51701: stdout chunk (state=3): >>><<< 11762 1726853268.51721: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853268.4888384-12602-45821359846167=/root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853268.51762: variable 'ansible_module_compression' from source: unknown 11762 1726853268.51807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853268.51844: variable 'ansible_facts' from source: unknown 11762 1726853268.51896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/AnsiballZ_stat.py 11762 1726853268.52001: Sending initial data 11762 1726853268.52004: Sent initial data (152 bytes) 11762 1726853268.52432: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853268.52440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853268.52467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.52475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853268.52478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853268.52480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.52531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853268.52537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853268.52539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853268.52609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.54281: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853268.54527: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853268.54612: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpocnned5d /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/AnsiballZ_stat.py <<< 11762 1726853268.54616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/AnsiballZ_stat.py" <<< 11762 1726853268.54696: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpocnned5d" to remote "/root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/AnsiballZ_stat.py" <<< 11762 1726853268.55639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853268.55680: stderr chunk (state=3): >>><<< 11762 1726853268.55684: stdout chunk (state=3): >>><<< 11762 1726853268.55692: done transferring module to remote 11762 1726853268.55707: _low_level_execute_command(): starting 11762 1726853268.55716: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/ /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/AnsiballZ_stat.py && sleep 0' 11762 1726853268.56456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853268.56475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.56557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853268.56614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853268.56693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.58615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853268.58636: stderr chunk (state=3): >>><<< 11762 1726853268.58639: stdout chunk (state=3): >>><<< 11762 1726853268.58655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853268.58659: _low_level_execute_command(): starting 11762 1726853268.58662: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/AnsiballZ_stat.py && sleep 0' 11762 1726853268.59146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853268.59150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.59152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853268.59155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.59206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853268.59210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853268.59215: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853268.59293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.75298: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27303, "dev": 23, "nlink": 1, "atime": 1726853267.0915518, "mtime": 1726853267.0915518, "ctime": 1726853267.0915518, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853268.76979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853268.76984: stdout chunk (state=3): >>><<< 11762 1726853268.76987: stderr chunk (state=3): >>><<< 11762 1726853268.76990: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27303, "dev": 23, "nlink": 1, "atime": 1726853267.0915518, "mtime": 1726853267.0915518, "ctime": 1726853267.0915518, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853268.76993: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853268.77009: _low_level_execute_command(): starting 11762 1726853268.77021: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853268.4888384-12602-45821359846167/ > /dev/null 2>&1 && sleep 0' 11762 1726853268.77626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853268.77640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853268.77653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853268.77670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853268.77690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 <<< 11762 1726853268.77701: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853268.77714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853268.77734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853268.77747: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853268.77839: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853268.77864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853268.78004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853268.80038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853268.80059: stdout chunk (state=3): >>><<< 11762 1726853268.80070: stderr chunk (state=3): >>><<< 11762 1726853268.80095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853268.80107: handler run complete 11762 1726853268.80176: attempt loop complete, returning result 11762 1726853268.80184: _execute() done 11762 1726853268.80190: dumping result to json 11762 1726853268.80200: done dumping result, returning 11762 1726853268.80211: done running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond [02083763-bbaf-d845-03d0-0000000004af] 11762 1726853268.80220: sending task result for task 02083763-bbaf-d845-03d0-0000000004af ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853267.0915518, "block_size": 4096, "blocks": 0, "ctime": 1726853267.0915518, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27303, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726853267.0915518, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11762 1726853268.80564: no more 
pending results, returning what we have 11762 1726853268.80569: results queue empty 11762 1726853268.80569: checking for any_errors_fatal 11762 1726853268.80572: done checking for any_errors_fatal 11762 1726853268.80573: checking for max_fail_percentage 11762 1726853268.80575: done checking for max_fail_percentage 11762 1726853268.80576: checking to see if all hosts have failed and the running result is not ok 11762 1726853268.80576: done checking to see if all hosts have failed 11762 1726853268.80577: getting the remaining hosts for this loop 11762 1726853268.80579: done getting the remaining hosts for this loop 11762 1726853268.80583: getting the next task for host managed_node2 11762 1726853268.80592: done getting next task for host managed_node2 11762 1726853268.80594: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11762 1726853268.80598: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11762 1726853268.80603: getting variables 11762 1726853268.80604: in VariableManager get_vars() 11762 1726853268.80640: Calling all_inventory to load vars for managed_node2 11762 1726853268.80645: Calling groups_inventory to load vars for managed_node2 11762 1726853268.80649: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.80661: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.80664: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.80667: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.81585: done sending task result for task 02083763-bbaf-d845-03d0-0000000004af 11762 1726853268.81589: WORKER PROCESS EXITING 11762 1726853268.83292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.84890: done with get_vars() 11762 1726853268.84915: done getting variables 11762 1726853268.84984: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853268.85125: variable 'interface' from source: task vars 11762 1726853268.85129: variable 'controller_device' from source: play vars 11762 1726853268.85192: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:48 -0400 (0:00:00.406) 0:00:19.282 ****** 11762 1726853268.85230: entering _queue_task() for managed_node2/assert 11762 1726853268.85597: worker is 1 (out of 1 available) 11762 1726853268.85612: exiting _queue_task() 
for managed_node2/assert 11762 1726853268.85625: done queuing things up, now waiting for results queue to drain 11762 1726853268.85627: waiting for pending results... 11762 1726853268.85933: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' 11762 1726853268.86080: in run() - task 02083763-bbaf-d845-03d0-0000000003f6 11762 1726853268.86104: variable 'ansible_search_path' from source: unknown 11762 1726853268.86111: variable 'ansible_search_path' from source: unknown 11762 1726853268.86152: calling self._execute() 11762 1726853268.86254: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.86265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.86279: variable 'omit' from source: magic vars 11762 1726853268.86676: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.86695: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.86706: variable 'omit' from source: magic vars 11762 1726853268.86783: variable 'omit' from source: magic vars 11762 1726853268.86889: variable 'interface' from source: task vars 11762 1726853268.86900: variable 'controller_device' from source: play vars 11762 1726853268.86978: variable 'controller_device' from source: play vars 11762 1726853268.87003: variable 'omit' from source: magic vars 11762 1726853268.87051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853268.87103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853268.87128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853268.87155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853268.87274: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853268.87279: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853268.87282: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.87290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.87346: Set connection var ansible_timeout to 10 11762 1726853268.87356: Set connection var ansible_shell_type to sh 11762 1726853268.87367: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853268.87380: Set connection var ansible_shell_executable to /bin/sh 11762 1726853268.87402: Set connection var ansible_pipelining to False 11762 1726853268.87416: Set connection var ansible_connection to ssh 11762 1726853268.87447: variable 'ansible_shell_executable' from source: unknown 11762 1726853268.87457: variable 'ansible_connection' from source: unknown 11762 1726853268.87465: variable 'ansible_module_compression' from source: unknown 11762 1726853268.87474: variable 'ansible_shell_type' from source: unknown 11762 1726853268.87482: variable 'ansible_shell_executable' from source: unknown 11762 1726853268.87488: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.87496: variable 'ansible_pipelining' from source: unknown 11762 1726853268.87509: variable 'ansible_timeout' from source: unknown 11762 1726853268.87518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.87678: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853268.87728: variable 'omit' from source: magic vars 11762 1726853268.87731: starting 
attempt loop 11762 1726853268.87733: running the handler 11762 1726853268.87861: variable 'interface_stat' from source: set_fact 11762 1726853268.87890: Evaluated conditional (interface_stat.stat.exists): True 11762 1726853268.87902: handler run complete 11762 1726853268.87949: attempt loop complete, returning result 11762 1726853268.87952: _execute() done 11762 1726853268.87955: dumping result to json 11762 1726853268.87957: done dumping result, returning 11762 1726853268.87959: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' [02083763-bbaf-d845-03d0-0000000003f6] 11762 1726853268.87965: sending task result for task 02083763-bbaf-d845-03d0-0000000003f6 11762 1726853268.88123: done sending task result for task 02083763-bbaf-d845-03d0-0000000003f6 11762 1726853268.88126: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853268.88212: no more pending results, returning what we have 11762 1726853268.88216: results queue empty 11762 1726853268.88217: checking for any_errors_fatal 11762 1726853268.88229: done checking for any_errors_fatal 11762 1726853268.88229: checking for max_fail_percentage 11762 1726853268.88232: done checking for max_fail_percentage 11762 1726853268.88232: checking to see if all hosts have failed and the running result is not ok 11762 1726853268.88233: done checking to see if all hosts have failed 11762 1726853268.88234: getting the remaining hosts for this loop 11762 1726853268.88236: done getting the remaining hosts for this loop 11762 1726853268.88239: getting the next task for host managed_node2 11762 1726853268.88253: done getting next task for host managed_node2 11762 1726853268.88256: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11762 1726853268.88260: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853268.88264: getting variables 11762 1726853268.88266: in VariableManager get_vars() 11762 1726853268.88305: Calling all_inventory to load vars for managed_node2 11762 1726853268.88308: Calling groups_inventory to load vars for managed_node2 11762 1726853268.88312: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.88337: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.88341: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.88347: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.90037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.91727: done with get_vars() 11762 1726853268.91756: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml:3 Friday 20 September 2024 13:27:48 -0400 (0:00:00.066) 0:00:19.349 ****** 11762 1726853268.91866: entering _queue_task() for managed_node2/include_tasks 11762 1726853268.92228: worker is 1 (out of 1 available) 11762 1726853268.92244: exiting _queue_task() for 
managed_node2/include_tasks 11762 1726853268.92259: done queuing things up, now waiting for results queue to drain 11762 1726853268.92260: waiting for pending results... 11762 1726853268.92682: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 11762 1726853268.92720: in run() - task 02083763-bbaf-d845-03d0-0000000003fb 11762 1726853268.92742: variable 'ansible_search_path' from source: unknown 11762 1726853268.92754: variable 'ansible_search_path' from source: unknown 11762 1726853268.92817: variable 'controller_profile' from source: play vars 11762 1726853268.93041: variable 'controller_profile' from source: play vars 11762 1726853268.93068: variable 'port1_profile' from source: play vars 11762 1726853268.93154: variable 'port1_profile' from source: play vars 11762 1726853268.93166: variable 'port2_profile' from source: play vars 11762 1726853268.93255: variable 'port2_profile' from source: play vars 11762 1726853268.93262: variable 'omit' from source: magic vars 11762 1726853268.93413: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.93474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.93479: variable 'omit' from source: magic vars 11762 1726853268.93727: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.93746: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.93787: variable 'bond_port_profile' from source: unknown 11762 1726853268.93862: variable 'bond_port_profile' from source: unknown 11762 1726853268.94278: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.94281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.94283: variable 'omit' from source: magic vars 11762 1726853268.94286: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.94288: Evaluated 
conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.94309: variable 'bond_port_profile' from source: unknown 11762 1726853268.94374: variable 'bond_port_profile' from source: unknown 11762 1726853268.94575: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853268.94579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853268.94581: variable 'omit' from source: magic vars 11762 1726853268.94688: variable 'ansible_distribution_major_version' from source: facts 11762 1726853268.94698: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853268.94775: variable 'bond_port_profile' from source: unknown 11762 1726853268.94802: variable 'bond_port_profile' from source: unknown 11762 1726853268.94928: dumping result to json 11762 1726853268.94931: done dumping result, returning 11762 1726853268.94933: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [02083763-bbaf-d845-03d0-0000000003fb] 11762 1726853268.94939: sending task result for task 02083763-bbaf-d845-03d0-0000000003fb 11762 1726853268.95079: no more pending results, returning what we have 11762 1726853268.95085: in VariableManager get_vars() 11762 1726853268.95125: Calling all_inventory to load vars for managed_node2 11762 1726853268.95129: Calling groups_inventory to load vars for managed_node2 11762 1726853268.95132: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853268.95150: Calling all_plugins_play to load vars for managed_node2 11762 1726853268.95160: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853268.95164: Calling groups_plugins_play to load vars for managed_node2 11762 1726853268.95887: done sending task result for task 02083763-bbaf-d845-03d0-0000000003fb 11762 1726853268.95890: WORKER PROCESS EXITING 11762 1726853268.96953: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853268.98567: done with get_vars() 11762 1726853268.98591: variable 'ansible_search_path' from source: unknown 11762 1726853268.98592: variable 'ansible_search_path' from source: unknown 11762 1726853268.98602: variable 'item' from source: include params 11762 1726853268.98720: variable 'item' from source: include params 11762 1726853268.98766: variable 'ansible_search_path' from source: unknown 11762 1726853268.98768: variable 'ansible_search_path' from source: unknown 11762 1726853268.98776: variable 'item' from source: include params 11762 1726853268.98834: variable 'item' from source: include params 11762 1726853268.98874: variable 'ansible_search_path' from source: unknown 11762 1726853268.98875: variable 'ansible_search_path' from source: unknown 11762 1726853268.98925: variable 'item' from source: include params 11762 1726853268.98989: variable 'item' from source: include params 11762 1726853268.99018: we have included files to process 11762 1726853268.99020: generating all_blocks data 11762 1726853268.99022: done generating all_blocks data 11762 1726853268.99025: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99026: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99028: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99196: in VariableManager get_vars() 11762 1726853268.99209: done with get_vars() 11762 1726853268.99402: done processing included file 11762 1726853268.99403: iterating over new_blocks loaded from include file 11762 1726853268.99404: in VariableManager get_vars() 11762 1726853268.99414: 
done with get_vars() 11762 1726853268.99415: filtering new block on tags 11762 1726853268.99452: done filtering new block on tags 11762 1726853268.99453: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0) 11762 1726853268.99457: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99458: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99460: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99521: in VariableManager get_vars() 11762 1726853268.99532: done with get_vars() 11762 1726853268.99701: done processing included file 11762 1726853268.99702: iterating over new_blocks loaded from include file 11762 1726853268.99703: in VariableManager get_vars() 11762 1726853268.99755: done with get_vars() 11762 1726853268.99757: filtering new block on tags 11762 1726853268.99790: done filtering new block on tags 11762 1726853268.99792: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.0) 11762 1726853268.99794: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99795: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99797: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11762 1726853268.99863: in VariableManager get_vars() 11762 1726853268.99876: done with get_vars() 11762 1726853269.00032: done processing included file 11762 1726853269.00033: iterating over new_blocks loaded from include file 11762 1726853269.00034: in VariableManager get_vars() 11762 1726853269.00045: done with get_vars() 11762 1726853269.00046: filtering new block on tags 11762 1726853269.00079: done filtering new block on tags 11762 1726853269.00081: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.1) 11762 1726853269.00083: extending task lists for all hosts with included blocks 11762 1726853269.00141: done extending task lists 11762 1726853269.00144: done processing included files 11762 1726853269.00145: results queue empty 11762 1726853269.00145: checking for any_errors_fatal 11762 1726853269.00147: done checking for any_errors_fatal 11762 1726853269.00148: checking for max_fail_percentage 11762 1726853269.00148: done checking for max_fail_percentage 11762 1726853269.00149: checking to see if all hosts have failed and the running result is not ok 11762 1726853269.00149: done checking to see if all hosts have failed 11762 1726853269.00150: getting the remaining hosts for this loop 11762 1726853269.00151: done getting the remaining hosts for this loop 11762 1726853269.00152: getting the next task for host managed_node2 11762 1726853269.00155: done getting next task for host managed_node2 11762 1726853269.00156: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11762 1726853269.00158: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853269.00160: getting variables 11762 1726853269.00161: in VariableManager get_vars() 11762 1726853269.00167: Calling all_inventory to load vars for managed_node2 11762 1726853269.00168: Calling groups_inventory to load vars for managed_node2 11762 1726853269.00170: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853269.00175: Calling all_plugins_play to load vars for managed_node2 11762 1726853269.00177: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853269.00179: Calling groups_plugins_play to load vars for managed_node2 11762 1726853269.01177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853269.02377: done with get_vars() 11762 1726853269.02396: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 13:27:49 -0400 (0:00:00.105) 0:00:19.455 ****** 11762 1726853269.02459: entering _queue_task() for managed_node2/include_tasks 11762 1726853269.02800: worker is 1 (out of 1 available) 11762 1726853269.02816: exiting _queue_task() for managed_node2/include_tasks 11762 1726853269.02831: done queuing things up, now waiting for results queue to drain 11762 1726853269.02832: waiting for pending results... 11762 1726853269.03042: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 11762 1726853269.03114: in run() - task 02083763-bbaf-d845-03d0-0000000004d9 11762 1726853269.03127: variable 'ansible_search_path' from source: unknown 11762 1726853269.03130: variable 'ansible_search_path' from source: unknown 11762 1726853269.03163: calling self._execute() 11762 1726853269.03230: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.03236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.03245: variable 'omit' from source: magic vars 11762 1726853269.03524: variable 'ansible_distribution_major_version' from source: facts 11762 1726853269.03534: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853269.03539: _execute() done 11762 1726853269.03542: dumping result to json 11762 1726853269.03548: done dumping result, returning 11762 1726853269.03555: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-d845-03d0-0000000004d9] 11762 1726853269.03560: sending task result for task 02083763-bbaf-d845-03d0-0000000004d9 11762 1726853269.03645: done sending task result for task 02083763-bbaf-d845-03d0-0000000004d9 11762 1726853269.03647: WORKER PROCESS EXITING 11762 1726853269.03675: no more pending results, returning what we have 11762 1726853269.03680: in VariableManager get_vars() 11762 1726853269.03715: Calling all_inventory to load vars for managed_node2 11762 
1726853269.03717: Calling groups_inventory to load vars for managed_node2 11762 1726853269.03720: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853269.03732: Calling all_plugins_play to load vars for managed_node2 11762 1726853269.03735: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853269.03737: Calling groups_plugins_play to load vars for managed_node2 11762 1726853269.05133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853269.06736: done with get_vars() 11762 1726853269.06758: variable 'ansible_search_path' from source: unknown 11762 1726853269.06760: variable 'ansible_search_path' from source: unknown 11762 1726853269.06799: we have included files to process 11762 1726853269.06800: generating all_blocks data 11762 1726853269.06801: done generating all_blocks data 11762 1726853269.06803: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11762 1726853269.06804: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11762 1726853269.06806: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11762 1726853269.07900: done processing included file 11762 1726853269.07902: iterating over new_blocks loaded from include file 11762 1726853269.07904: in VariableManager get_vars() 11762 1726853269.07920: done with get_vars() 11762 1726853269.07922: filtering new block on tags 11762 1726853269.08041: done filtering new block on tags 11762 1726853269.08046: in VariableManager get_vars() 11762 1726853269.08060: done with get_vars() 11762 1726853269.08062: filtering new block on tags 11762 1726853269.08115: done filtering new block on tags 11762 1726853269.08118: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 11762 1726853269.08122: extending task lists for all hosts with included blocks 11762 1726853269.08481: done extending task lists 11762 1726853269.08482: done processing included files 11762 1726853269.08483: results queue empty 11762 1726853269.08484: checking for any_errors_fatal 11762 1726853269.08488: done checking for any_errors_fatal 11762 1726853269.08488: checking for max_fail_percentage 11762 1726853269.08490: done checking for max_fail_percentage 11762 1726853269.08490: checking to see if all hosts have failed and the running result is not ok 11762 1726853269.08491: done checking to see if all hosts have failed 11762 1726853269.08492: getting the remaining hosts for this loop 11762 1726853269.08493: done getting the remaining hosts for this loop 11762 1726853269.08496: getting the next task for host managed_node2 11762 1726853269.08501: done getting next task for host managed_node2 11762 1726853269.08503: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11762 1726853269.08506: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853269.08509: getting variables 11762 1726853269.08510: in VariableManager get_vars() 11762 1726853269.08519: Calling all_inventory to load vars for managed_node2 11762 1726853269.08521: Calling groups_inventory to load vars for managed_node2 11762 1726853269.08524: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853269.08530: Calling all_plugins_play to load vars for managed_node2 11762 1726853269.08533: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853269.08536: Calling groups_plugins_play to load vars for managed_node2 11762 1726853269.10615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853269.12105: done with get_vars() 11762 1726853269.12122: done getting variables 11762 1726853269.12154: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:49 -0400 (0:00:00.097) 0:00:19.552 ****** 11762 1726853269.12178: entering _queue_task() for managed_node2/set_fact 11762 1726853269.12431: worker is 1 (out of 1 available) 11762 1726853269.12444: exiting _queue_task() for managed_node2/set_fact 11762 1726853269.12458: done queuing things up, now waiting for results queue to drain 11762 1726853269.12460: waiting for pending results... 11762 1726853269.12650: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 11762 1726853269.12801: in run() - task 02083763-bbaf-d845-03d0-0000000004fc 11762 1726853269.12805: variable 'ansible_search_path' from source: unknown 11762 1726853269.12808: variable 'ansible_search_path' from source: unknown 11762 1726853269.12847: calling self._execute() 11762 1726853269.12915: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.12949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.12953: variable 'omit' from source: magic vars 11762 1726853269.13310: variable 'ansible_distribution_major_version' from source: facts 11762 1726853269.13377: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853269.13381: variable 'omit' from source: magic vars 11762 1726853269.13392: variable 'omit' from source: magic vars 11762 1726853269.13428: variable 'omit' from source: magic vars 11762 1726853269.13468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853269.13505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853269.13526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853269.13543: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853269.13558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853269.13589: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853269.13593: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.13595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.13763: Set connection var ansible_timeout to 10 11762 1726853269.13767: Set connection var ansible_shell_type to sh 11762 1726853269.13769: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853269.13775: Set connection var ansible_shell_executable to /bin/sh 11762 1726853269.13777: Set connection var ansible_pipelining to False 11762 1726853269.13780: Set connection var ansible_connection to ssh 11762 1726853269.13782: variable 'ansible_shell_executable' from source: unknown 11762 1726853269.13784: variable 'ansible_connection' from source: unknown 11762 1726853269.13787: variable 'ansible_module_compression' from source: unknown 11762 1726853269.13789: variable 'ansible_shell_type' from source: unknown 11762 1726853269.13791: variable 'ansible_shell_executable' from source: unknown 11762 1726853269.13793: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.13796: variable 'ansible_pipelining' from source: unknown 11762 1726853269.13798: variable 'ansible_timeout' from source: unknown 11762 1726853269.13800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.13932: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853269.13936: variable 'omit' from source: magic vars 11762 1726853269.13939: starting attempt loop 11762 1726853269.13941: running the handler 11762 1726853269.13954: handler run complete 11762 1726853269.13973: attempt loop complete, returning result 11762 1726853269.13976: _execute() done 11762 1726853269.13979: dumping result to json 11762 1726853269.13981: done dumping result, returning 11762 1726853269.13988: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-d845-03d0-0000000004fc] 11762 1726853269.13993: sending task result for task 02083763-bbaf-d845-03d0-0000000004fc ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11762 1726853269.14154: no more pending results, returning what we have 11762 1726853269.14157: results queue empty 11762 1726853269.14158: checking for any_errors_fatal 11762 1726853269.14160: done checking for any_errors_fatal 11762 1726853269.14161: checking for max_fail_percentage 11762 1726853269.14162: done checking for max_fail_percentage 11762 1726853269.14163: checking to see if all hosts have failed and the running result is not ok 11762 1726853269.14164: done checking to see if all hosts have failed 11762 1726853269.14164: getting the remaining hosts for this loop 11762 1726853269.14166: done getting the remaining hosts for this loop 11762 1726853269.14170: getting the next task for host managed_node2 11762 1726853269.14182: done getting next task for host managed_node2 11762 1726853269.14184: ^ task is: TASK: Stat profile file 11762 1726853269.14191: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853269.14195: getting variables 11762 1726853269.14196: in VariableManager get_vars() 11762 1726853269.14229: Calling all_inventory to load vars for managed_node2 11762 1726853269.14232: Calling groups_inventory to load vars for managed_node2 11762 1726853269.14235: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853269.14245: Calling all_plugins_play to load vars for managed_node2 11762 1726853269.14248: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853269.14250: Calling groups_plugins_play to load vars for managed_node2 11762 1726853269.14785: done sending task result for task 02083763-bbaf-d845-03d0-0000000004fc 11762 1726853269.14788: WORKER PROCESS EXITING 11762 1726853269.15060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853269.16014: done with get_vars() 11762 1726853269.16039: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:49 -0400 (0:00:00.039) 0:00:19.591 ****** 11762 1726853269.16138: entering _queue_task() for managed_node2/stat 11762 1726853269.16441: worker is 1 (out of 1 available) 11762 1726853269.16456: exiting _queue_task() for managed_node2/stat 11762 1726853269.16470: done queuing things up, now waiting for results queue to drain 11762 1726853269.16473: waiting for pending results... 
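The `ok: [managed_node2]` result above prints the exact facts the "Initialize NM profile exist and ansible_managed comment flag" task sets. Reconstructed from that result, the task in `get_profile_stat.yml:3` is (a sketch; the real file may use `ansible.builtin.set_fact` and YAML booleans in either case):

```yaml
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    # These three flags match the ansible_facts shown in the task result above;
    # they start false and are flipped by the later stat/grep tasks in this file.
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Because `set_fact` runs entirely on the controller, the log shows `handler run complete` immediately after `running the handler`, with no `_low_level_execute_command()` round-trip to the managed node.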
11762 1726853269.16809: running TaskExecutor() for managed_node2/TASK: Stat profile file 11762 1726853269.16908: in run() - task 02083763-bbaf-d845-03d0-0000000004fd 11762 1726853269.16928: variable 'ansible_search_path' from source: unknown 11762 1726853269.16935: variable 'ansible_search_path' from source: unknown 11762 1726853269.16976: calling self._execute() 11762 1726853269.17069: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.17086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.17101: variable 'omit' from source: magic vars 11762 1726853269.17558: variable 'ansible_distribution_major_version' from source: facts 11762 1726853269.17591: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853269.17594: variable 'omit' from source: magic vars 11762 1726853269.17632: variable 'omit' from source: magic vars 11762 1726853269.17702: variable 'profile' from source: include params 11762 1726853269.17705: variable 'bond_port_profile' from source: include params 11762 1726853269.17751: variable 'bond_port_profile' from source: include params 11762 1726853269.17782: variable 'omit' from source: magic vars 11762 1726853269.17814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853269.17846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853269.17860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853269.17875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853269.17889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853269.17908: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11762 1726853269.17911: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.17916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.17983: Set connection var ansible_timeout to 10 11762 1726853269.17987: Set connection var ansible_shell_type to sh 11762 1726853269.17995: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853269.17998: Set connection var ansible_shell_executable to /bin/sh 11762 1726853269.18004: Set connection var ansible_pipelining to False 11762 1726853269.18010: Set connection var ansible_connection to ssh 11762 1726853269.18027: variable 'ansible_shell_executable' from source: unknown 11762 1726853269.18030: variable 'ansible_connection' from source: unknown 11762 1726853269.18032: variable 'ansible_module_compression' from source: unknown 11762 1726853269.18034: variable 'ansible_shell_type' from source: unknown 11762 1726853269.18037: variable 'ansible_shell_executable' from source: unknown 11762 1726853269.18040: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.18047: variable 'ansible_pipelining' from source: unknown 11762 1726853269.18049: variable 'ansible_timeout' from source: unknown 11762 1726853269.18052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.18195: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853269.18204: variable 'omit' from source: magic vars 11762 1726853269.18210: starting attempt loop 11762 1726853269.18213: running the handler 11762 1726853269.18226: _low_level_execute_command(): starting 11762 1726853269.18233: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853269.18720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.18723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.18726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.18729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.18775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.18791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.18875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.20709: stdout chunk (state=3): >>>/root <<< 11762 1726853269.20852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.20888: stderr chunk (state=3): >>><<< 11762 1726853269.20890: stdout chunk (state=3): >>><<< 11762 1726853269.20904: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853269.20976: _low_level_execute_command(): starting 11762 1726853269.20980: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673 `" && echo ansible-tmp-1726853269.2091184-12659-19986789933673="` echo /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673 `" ) && sleep 0' 11762 1726853269.21348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.21352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853269.21380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.21424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853269.21429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.21432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.21500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.23541: stdout chunk (state=3): >>>ansible-tmp-1726853269.2091184-12659-19986789933673=/root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673 <<< 11762 1726853269.23646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.23674: stderr chunk (state=3): >>><<< 11762 1726853269.23677: stdout chunk (state=3): >>><<< 11762 1726853269.23694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853269.2091184-12659-19986789933673=/root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853269.23733: variable 'ansible_module_compression' from source: unknown 11762 1726853269.23778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853269.23813: variable 'ansible_facts' from source: unknown 11762 1726853269.23863: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/AnsiballZ_stat.py 11762 1726853269.23963: Sending initial data 11762 1726853269.23967: Sent initial data (152 bytes) 11762 1726853269.24403: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.24406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853269.24409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.24411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.24414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.24460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.24591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.26261: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853269.26327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853269.26397: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpw8z0uc3v /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/AnsiballZ_stat.py <<< 11762 1726853269.26404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/AnsiballZ_stat.py" <<< 11762 1726853269.26465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpw8z0uc3v" to remote "/root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/AnsiballZ_stat.py" <<< 11762 1726853269.26473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/AnsiballZ_stat.py" <<< 11762 1726853269.27111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.27158: stderr chunk (state=3): >>><<< 11762 1726853269.27161: stdout chunk (state=3): >>><<< 11762 1726853269.27183: done transferring module to remote 11762 1726853269.27193: _low_level_execute_command(): starting 11762 1726853269.27197: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/ /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/AnsiballZ_stat.py && sleep 0' 11762 1726853269.27634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.27644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.27664: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853269.27668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.27728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853269.27731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.27734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.27806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.29695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.29710: stderr chunk (state=3): >>><<< 11762 1726853269.29713: stdout chunk (state=3): >>><<< 11762 1726853269.29731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853269.29735: _low_level_execute_command(): starting 11762 1726853269.29738: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/AnsiballZ_stat.py && sleep 0' 11762 1726853269.30204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.30207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.30210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.30212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853269.30214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.30265: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853269.30268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.30348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.45891: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853269.47277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853269.47281: stdout chunk (state=3): >>><<< 11762 1726853269.47478: stderr chunk (state=3): >>><<< 11762 1726853269.47483: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853269.47486: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853269.47489: _low_level_execute_command(): starting 11762 1726853269.47491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853269.2091184-12659-19986789933673/ > /dev/null 2>&1 && sleep 0' 11762 1726853269.47997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853269.48006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.48016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.48037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.48053: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853269.48060: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853269.48070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.48087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853269.48094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853269.48101: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853269.48143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.48197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853269.48211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.48230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.48427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.50578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.50582: stdout chunk (state=3): >>><<< 11762 1726853269.50584: stderr chunk (state=3): >>><<< 11762 1726853269.50586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853269.50588: handler run complete 11762 1726853269.50589: attempt loop complete, returning result 11762 1726853269.50591: _execute() done 11762 1726853269.50592: dumping result to json 11762 1726853269.50594: done dumping result, returning 11762 1726853269.50595: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-d845-03d0-0000000004fd] 11762 1726853269.50597: sending task result for task 02083763-bbaf-d845-03d0-0000000004fd 11762 1726853269.50663: done sending task result for task 02083763-bbaf-d845-03d0-0000000004fd 11762 1726853269.50666: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11762 1726853269.50722: no more pending results, returning what we have 11762 1726853269.50725: results queue empty 11762 1726853269.50726: checking for any_errors_fatal 11762 1726853269.50731: done checking for any_errors_fatal 11762 1726853269.50732: checking for max_fail_percentage 11762 1726853269.50734: done checking for max_fail_percentage 11762 1726853269.50734: checking 
to see if all hosts have failed and the running result is not ok 11762 1726853269.50735: done checking to see if all hosts have failed 11762 1726853269.50735: getting the remaining hosts for this loop 11762 1726853269.50737: done getting the remaining hosts for this loop 11762 1726853269.50740: getting the next task for host managed_node2 11762 1726853269.50747: done getting next task for host managed_node2 11762 1726853269.50749: ^ task is: TASK: Set NM profile exist flag based on the profile files 11762 1726853269.50754: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853269.50758: getting variables 11762 1726853269.50759: in VariableManager get_vars() 11762 1726853269.50798: Calling all_inventory to load vars for managed_node2 11762 1726853269.50801: Calling groups_inventory to load vars for managed_node2 11762 1726853269.50804: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853269.50813: Calling all_plugins_play to load vars for managed_node2 11762 1726853269.50815: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853269.50817: Calling groups_plugins_play to load vars for managed_node2 11762 1726853269.52220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853269.53815: done with get_vars() 11762 1726853269.53849: done getting variables 11762 1726853269.53913: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:49 -0400 (0:00:00.378) 0:00:19.969 ****** 11762 1726853269.53949: entering _queue_task() for managed_node2/set_fact 11762 1726853269.54314: worker is 1 (out of 1 available) 11762 1726853269.54329: exiting _queue_task() for managed_node2/set_fact 11762 1726853269.54342: done queuing things up, now waiting for results queue to drain 11762 1726853269.54343: waiting for pending results... 
11762 1726853269.54633: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11762 1726853269.54796: in run() - task 02083763-bbaf-d845-03d0-0000000004fe 11762 1726853269.54800: variable 'ansible_search_path' from source: unknown 11762 1726853269.54803: variable 'ansible_search_path' from source: unknown 11762 1726853269.54830: calling self._execute() 11762 1726853269.54976: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853269.54980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853269.54983: variable 'omit' from source: magic vars 11762 1726853269.55432: variable 'ansible_distribution_major_version' from source: facts 11762 1726853269.55444: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853269.55576: variable 'profile_stat' from source: set_fact 11762 1726853269.55592: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853269.55595: when evaluation is False, skipping this task 11762 1726853269.55598: _execute() done 11762 1726853269.55601: dumping result to json 11762 1726853269.55604: done dumping result, returning 11762 1726853269.55674: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-d845-03d0-0000000004fe] 11762 1726853269.55678: sending task result for task 02083763-bbaf-d845-03d0-0000000004fe 11762 1726853269.55962: done sending task result for task 02083763-bbaf-d845-03d0-0000000004fe 11762 1726853269.55964: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853269.56048: no more pending results, returning what we have 11762 1726853269.56053: results queue empty 11762 1726853269.56054: checking for any_errors_fatal 11762 1726853269.56062: done checking for any_errors_fatal 11762 1726853269.56063: 
checking for max_fail_percentage
11762 1726853269.56065: done checking for max_fail_percentage
11762 1726853269.56065: checking to see if all hosts have failed and the running result is not ok
11762 1726853269.56066: done checking to see if all hosts have failed
11762 1726853269.56067: getting the remaining hosts for this loop
11762 1726853269.56068: done getting the remaining hosts for this loop
11762 1726853269.56074: getting the next task for host managed_node2
11762 1726853269.56080: done getting next task for host managed_node2
11762 1726853269.56082: ^ task is: TASK: Get NM profile info
11762 1726853269.56089: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False
11762 1726853269.56092: getting variables
11762 1726853269.56094: in VariableManager get_vars()
11762 1726853269.56123: Calling all_inventory to load vars for managed_node2
11762 1726853269.56126: Calling groups_inventory to load vars for managed_node2
11762 1726853269.56128: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853269.56138: Calling all_plugins_play to load vars for managed_node2
11762 1726853269.56141: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853269.56143: Calling groups_plugins_play to load vars for managed_node2
11762 1726853269.57927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853269.60827: done with get_vars()
11762 1726853269.60861: done getting variables
11762 1726853269.61057: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Friday 20 September 2024 13:27:49 -0400 (0:00:00.071) 0:00:20.041 ******
11762 1726853269.61097: entering _queue_task() for managed_node2/shell
11762 1726853269.61711: worker is 1 (out of 1 available)
11762 1726853269.61724: exiting _queue_task() for managed_node2/shell
11762 1726853269.61736: done queuing things up, now waiting for results queue to drain
11762 1726853269.61737: waiting for pending results... 
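Before the queued task runs, note the control flow the log just recorded for the skipped task: the `when` conditions are evaluated in order, `ansible_distribution_major_version != '6'` passes, `profile_stat.stat.exists` evaluates False, and the executor short-circuits into a skip result carrying the failing condition. A minimal sketch of that logic (hypothetical helper, not Ansible's real TaskExecutor API; the condition values here are taken from the log):

```python
# Hypothetical sketch of a when-clause evaluation loop. On the first False
# condition, the task is skipped and the result records which condition failed,
# matching the "skipping: [managed_node2]" JSON shown in the log above.

def run_task(when_conditions, executor):
    """Evaluate (expression, value) pairs in order; skip on the first False."""
    for expression, value in when_conditions:
        # mirrors "Evaluated conditional (<expression>): <bool>" log entries
        if not value:
            return {
                "changed": False,
                "false_condition": expression,
                "skip_reason": "Conditional result was False",
            }
    return executor()  # all conditionals True: actually run the task

result = run_task(
    [("ansible_distribution_major_version != '6'", True),
     ("profile_stat.stat.exists", False)],
    executor=lambda: {"changed": True},
)
```

The `false_condition` field is what lets the skip message name the exact expression that stopped the task.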
11762 1726853269.62292: running TaskExecutor() for managed_node2/TASK: Get NM profile info
11762 1726853269.62362: in run() - task 02083763-bbaf-d845-03d0-0000000004ff
11762 1726853269.62366: variable 'ansible_search_path' from source: unknown
11762 1726853269.62368: variable 'ansible_search_path' from source: unknown
11762 1726853269.62401: calling self._execute()
11762 1726853269.62526: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853269.62530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853269.62577: variable 'omit' from source: magic vars
11762 1726853269.63280: variable 'ansible_distribution_major_version' from source: facts
11762 1726853269.63284: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853269.63286: variable 'omit' from source: magic vars
11762 1726853269.63289: variable 'omit' from source: magic vars
11762 1726853269.63314: variable 'profile' from source: include params
11762 1726853269.63319: variable 'bond_port_profile' from source: include params
11762 1726853269.63382: variable 'bond_port_profile' from source: include params
11762 1726853269.63520: variable 'omit' from source: magic vars
11762 1726853269.63559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853269.63632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853269.63651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853269.63668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853269.63681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853269.63707: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853269.63710: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853269.63831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853269.64077: Set connection var ansible_timeout to 10
11762 1726853269.64080: Set connection var ansible_shell_type to sh
11762 1726853269.64083: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853269.64085: Set connection var ansible_shell_executable to /bin/sh
11762 1726853269.64087: Set connection var ansible_pipelining to False
11762 1726853269.64090: Set connection var ansible_connection to ssh
11762 1726853269.64092: variable 'ansible_shell_executable' from source: unknown
11762 1726853269.64094: variable 'ansible_connection' from source: unknown
11762 1726853269.64095: variable 'ansible_module_compression' from source: unknown
11762 1726853269.64157: variable 'ansible_shell_type' from source: unknown
11762 1726853269.64160: variable 'ansible_shell_executable' from source: unknown
11762 1726853269.64163: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853269.64168: variable 'ansible_pipelining' from source: unknown
11762 1726853269.64173: variable 'ansible_timeout' from source: unknown
11762 1726853269.64175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853269.64423: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853269.64776: variable 'omit' from source: magic vars
11762 1726853269.64780: starting attempt loop
11762 1726853269.64782: running the handler
11762 1726853269.64785: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853269.64787: _low_level_execute_command(): starting 11762 1726853269.64790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853269.66176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853269.66180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.66183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.66187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.66189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853269.66192: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853269.66194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.66196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853269.66204: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853269.66210: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853269.66218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.66227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.66238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.66250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853269.66323: stderr chunk (state=3): >>>debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.66901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.66911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.68730: stdout chunk (state=3): >>>/root <<< 11762 1726853269.68825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.68884: stderr chunk (state=3): >>><<< 11762 1726853269.68970: stdout chunk (state=3): >>><<< 11762 1726853269.69000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853269.69015: _low_level_execute_command(): 
starting 11762 1726853269.69022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268 `" && echo ansible-tmp-1726853269.689991-12684-149812386169268="` echo /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268 `" ) && sleep 0' 11762 1726853269.70269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853269.70381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.70391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.70406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.70420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853269.70427: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853269.70436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.70451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853269.70459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853269.70466: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853269.70480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.70490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.70502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.70510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853269.70652: stderr chunk 
(state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.70698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.70808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.73088: stdout chunk (state=3): >>>ansible-tmp-1726853269.689991-12684-149812386169268=/root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268 <<< 11762 1726853269.73146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.73149: stdout chunk (state=3): >>><<< 11762 1726853269.73155: stderr chunk (state=3): >>><<< 11762 1726853269.73180: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853269.689991-12684-149812386169268=/root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853269.73218: variable 'ansible_module_compression' from source: unknown 11762 1726853269.73268: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853269.73476: variable 'ansible_facts' from source: unknown 11762 1726853269.73590: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/AnsiballZ_command.py 11762 1726853269.73890: Sending initial data 11762 1726853269.73894: Sent initial data (155 bytes) 11762 1726853269.75276: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.75477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.75693: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.77288: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853269.77297: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11762 1726853269.77304: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11762 1726853269.77312: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11762 1726853269.77328: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853269.77420: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853269.77656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp9f5jky_8 /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/AnsiballZ_command.py <<< 11762 1726853269.77659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/AnsiballZ_command.py" <<< 11762 1726853269.77870: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp9f5jky_8" to remote "/root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/AnsiballZ_command.py" <<< 11762 1726853269.79304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.79332: stderr chunk (state=3): >>><<< 11762 1726853269.79335: stdout chunk (state=3): >>><<< 11762 1726853269.79390: done transferring module to remote 11762 1726853269.79401: _low_level_execute_command(): starting 11762 1726853269.79406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/ /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/AnsiballZ_command.py && sleep 0' 11762 1726853269.80488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.80492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853269.80496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 11762 1726853269.80498: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.80500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.80714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853269.80725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853269.80896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.80987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853269.83181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853269.83185: stdout chunk (state=3): >>><<< 11762 1726853269.83187: stderr chunk (state=3): >>><<< 11762 1726853269.83191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853269.83194: _low_level_execute_command(): starting 11762 1726853269.83196: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/AnsiballZ_command.py && sleep 0' 11762 1726853269.84858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.84864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.84867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853269.84887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853269.84920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853269.84933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853269.85106: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853269.85190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853269.85401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853270.08394: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:27:50.011059", "end": "2024-09-20 13:27:50.082757", "delta": "0:00:00.071698", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853270.10179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853270.10222: stderr chunk (state=3): >>><<< 11762 1726853270.10225: stdout chunk (state=3): >>><<< 11762 1726853270.10355: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:27:50.011059", "end": "2024-09-20 13:27:50.082757", "delta": "0:00:00.071698", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853270.10364: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853270.10367: _low_level_execute_command(): starting 11762 1726853270.10369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853269.689991-12684-149812386169268/ > /dev/null 2>&1 && sleep 0' 11762 1726853270.11597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853270.11675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853270.11689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853270.11876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853270.13795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853270.13831: stderr chunk (state=3): >>><<< 11762 1726853270.13840: stdout chunk (state=3): >>><<< 11762 1726853270.14177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853270.14181: handler run complete 11762 1726853270.14184: Evaluated conditional (False): False 11762 1726853270.14186: attempt loop complete, returning result 11762 1726853270.14188: _execute() done 11762 1726853270.14190: dumping result to json 11762 1726853270.14192: done dumping result, returning 11762 1726853270.14194: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-d845-03d0-0000000004ff] 11762 1726853270.14196: sending task result for task 02083763-bbaf-d845-03d0-0000000004ff 11762 1726853270.14267: done sending task result for task 02083763-bbaf-d845-03d0-0000000004ff 11762 1726853270.14274: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.071698", "end": "2024-09-20 13:27:50.082757", "rc": 0, "start": "2024-09-20 13:27:50.011059" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11762 1726853270.14357: no more pending results, returning what we have 11762 1726853270.14361: results queue empty 11762 1726853270.14362: checking for any_errors_fatal 11762 1726853270.14578: done checking for any_errors_fatal 11762 1726853270.14579: checking for max_fail_percentage 11762 1726853270.14581: done checking for max_fail_percentage 11762 1726853270.14582: checking to see if all hosts have failed and the running result is not ok 11762 1726853270.14583: done checking to see if all hosts have failed 11762 1726853270.14584: getting the remaining hosts for this loop 11762 1726853270.14585: done getting the remaining hosts for this loop 11762 1726853270.14589: getting the next task for host managed_node2 11762 1726853270.14596: done getting next 
task for host managed_node2 11762 1726853270.14598: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11762 1726853270.14604: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853270.14607: getting variables 11762 1726853270.14608: in VariableManager get_vars() 11762 1726853270.14639: Calling all_inventory to load vars for managed_node2 11762 1726853270.14641: Calling groups_inventory to load vars for managed_node2 11762 1726853270.14647: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.14658: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.14661: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.14664: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.17309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.20782: done with get_vars() 11762 1726853270.20931: done getting variables 11762 1726853270.20997: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:50 -0400 (0:00:00.599) 0:00:20.640 ****** 11762 1726853270.21077: entering _queue_task() for managed_node2/set_fact 11762 1726853270.21840: worker is 1 (out of 1 available) 11762 1726853270.21855: exiting _queue_task() for managed_node2/set_fact 11762 1726853270.21867: done queuing things up, now waiting for results queue to drain 11762 1726853270.21868: waiting for pending results... 
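The "Get NM profile info" result above comes from piping `nmcli -f NAME,FILENAME connection show` through `grep bond0 | grep /etc`, which lists the bond0 profiles NetworkManager keeps under /etc. A minimal sketch of that filtering, using the sample STDOUT from the log (the `find_profiles` helper and its parameters are hypothetical, not part of the role):

```python
# Sketch of the check done by the task above: keep only nmcli rows whose
# NAME matches the profile and whose FILENAME lives under /etc.
def find_profiles(nmcli_output: str, name_filter: str = "bond0",
                  path_filter: str = "/etc") -> list[tuple[str, str]]:
    """Return (NAME, FILENAME) pairs matching both filters, mirroring the greps."""
    matches = []
    for line in nmcli_output.splitlines():
        if name_filter in line and path_filter in line:
            name, filename = line.split(maxsplit=1)
            matches.append((name, filename.strip()))
    return matches

# Sample output copied from the task result in the log.
sample = """\
bond0.0  /etc/NetworkManager/system-connections/bond0.0.nmconnection
bond0.1  /etc/NetworkManager/system-connections/bond0.1.nmconnection
bond0    /etc/NetworkManager/system-connections/bond0.nmconnection
"""
print(find_profiles(sample))
```

Because the command exits 0 only when grep finds a match, a non-empty result here corresponds to the `rc: 0` seen in the task output.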
11762 1726853270.22589: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11762 1726853270.22596: in run() - task 02083763-bbaf-d845-03d0-000000000500 11762 1726853270.22599: variable 'ansible_search_path' from source: unknown 11762 1726853270.22602: variable 'ansible_search_path' from source: unknown 11762 1726853270.22978: calling self._execute() 11762 1726853270.22982: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.22985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.22989: variable 'omit' from source: magic vars 11762 1726853270.23683: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.23701: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.23998: variable 'nm_profile_exists' from source: set_fact 11762 1726853270.24016: Evaluated conditional (nm_profile_exists.rc == 0): True 11762 1726853270.24029: variable 'omit' from source: magic vars 11762 1726853270.24095: variable 'omit' from source: magic vars 11762 1726853270.24211: variable 'omit' from source: magic vars 11762 1726853270.24423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853270.24465: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853270.24492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853270.24514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853270.24532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853270.24566: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
11762 1726853270.24876: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.24879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.24894: Set connection var ansible_timeout to 10 11762 1726853270.24902: Set connection var ansible_shell_type to sh 11762 1726853270.24913: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853270.24924: Set connection var ansible_shell_executable to /bin/sh 11762 1726853270.24938: Set connection var ansible_pipelining to False 11762 1726853270.24949: Set connection var ansible_connection to ssh 11762 1726853270.24981: variable 'ansible_shell_executable' from source: unknown 11762 1726853270.24990: variable 'ansible_connection' from source: unknown 11762 1726853270.24997: variable 'ansible_module_compression' from source: unknown 11762 1726853270.25004: variable 'ansible_shell_type' from source: unknown 11762 1726853270.25009: variable 'ansible_shell_executable' from source: unknown 11762 1726853270.25015: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.25023: variable 'ansible_pipelining' from source: unknown 11762 1726853270.25376: variable 'ansible_timeout' from source: unknown 11762 1726853270.25380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.25577: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853270.25581: variable 'omit' from source: magic vars 11762 1726853270.25583: starting attempt loop 11762 1726853270.25585: running the handler 11762 1726853270.25587: handler run complete 11762 1726853270.25589: attempt loop complete, returning result 11762 1726853270.25591: _execute() done 
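The conditional chain logged above gates the `set_fact` task twice: first on `ansible_distribution_major_version != '6'`, then on `nm_profile_exists.rc == 0` from the earlier nmcli command. A hedged sketch of that gating (variable names mirror the log; the `evaluate_flags` function is an illustration, not the role's actual implementation):

```python
# Illustrates the two `when:` evaluations visible in the log: the flags are
# only set when the host is not EL6 and the nmcli lookup succeeded (rc == 0).
def evaluate_flags(distribution_major_version: str, nm_profile_rc: int) -> dict:
    if distribution_major_version == '6':
        return {}          # task skipped entirely on EL6
    if nm_profile_rc != 0:
        return {}          # profile not found: flags stay unset
    return {
        "lsr_net_profile_exists": True,
        "lsr_net_profile_ansible_managed": True,
        "lsr_net_profile_fingerprint": True,
    }

print(evaluate_flags('9', 0))
```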
11762 1726853270.25593: dumping result to json 11762 1726853270.25595: done dumping result, returning 11762 1726853270.25597: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-d845-03d0-000000000500] 11762 1726853270.25599: sending task result for task 02083763-bbaf-d845-03d0-000000000500 11762 1726853270.25675: done sending task result for task 02083763-bbaf-d845-03d0-000000000500 11762 1726853270.25679: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11762 1726853270.25740: no more pending results, returning what we have 11762 1726853270.25746: results queue empty 11762 1726853270.25747: checking for any_errors_fatal 11762 1726853270.25758: done checking for any_errors_fatal 11762 1726853270.25759: checking for max_fail_percentage 11762 1726853270.25761: done checking for max_fail_percentage 11762 1726853270.25762: checking to see if all hosts have failed and the running result is not ok 11762 1726853270.25762: done checking to see if all hosts have failed 11762 1726853270.25763: getting the remaining hosts for this loop 11762 1726853270.25765: done getting the remaining hosts for this loop 11762 1726853270.25770: getting the next task for host managed_node2 11762 1726853270.25782: done getting next task for host managed_node2 11762 1726853270.25784: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11762 1726853270.25789: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853270.25793: getting variables 11762 1726853270.25794: in VariableManager get_vars() 11762 1726853270.25828: Calling all_inventory to load vars for managed_node2 11762 1726853270.25830: Calling groups_inventory to load vars for managed_node2 11762 1726853270.25833: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.25846: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.25849: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.25851: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.28991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.32463: done with get_vars() 11762 1726853270.32505: done getting variables 11762 1726853270.32618: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853270.32969: variable 'profile' from source: include params 11762 1726853270.32975: variable 'bond_port_profile' from source: include params 11762 1726853270.33035: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:50 -0400 (0:00:00.120) 0:00:20.761 ****** 11762 1726853270.33185: entering _queue_task() for managed_node2/command 11762 1726853270.33947: worker is 1 (out of 1 available) 11762 1726853270.33961: exiting _queue_task() for managed_node2/command 11762 1726853270.33977: done queuing things up, now waiting for results queue to drain 11762 1726853270.33979: waiting for pending 
results... 11762 1726853270.34590: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 11762 1726853270.34596: in run() - task 02083763-bbaf-d845-03d0-000000000502 11762 1726853270.34977: variable 'ansible_search_path' from source: unknown 11762 1726853270.34981: variable 'ansible_search_path' from source: unknown 11762 1726853270.34984: calling self._execute() 11762 1726853270.34987: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.34990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.34992: variable 'omit' from source: magic vars 11762 1726853270.35694: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.35710: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.36036: variable 'profile_stat' from source: set_fact 11762 1726853270.36054: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853270.36061: when evaluation is False, skipping this task 11762 1726853270.36068: _execute() done 11762 1726853270.36078: dumping result to json 11762 1726853270.36085: done dumping result, returning 11762 1726853270.36097: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-d845-03d0-000000000502] 11762 1726853270.36109: sending task result for task 02083763-bbaf-d845-03d0-000000000502 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853270.36384: no more pending results, returning what we have 11762 1726853270.36389: results queue empty 11762 1726853270.36390: checking for any_errors_fatal 11762 1726853270.36398: done checking for any_errors_fatal 11762 1726853270.36399: checking for max_fail_percentage 11762 1726853270.36401: done checking for max_fail_percentage 11762 1726853270.36402: checking to see if all 
hosts have failed and the running result is not ok 11762 1726853270.36402: done checking to see if all hosts have failed 11762 1726853270.36403: getting the remaining hosts for this loop 11762 1726853270.36405: done getting the remaining hosts for this loop 11762 1726853270.36409: getting the next task for host managed_node2 11762 1726853270.36420: done getting next task for host managed_node2 11762 1726853270.36423: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11762 1726853270.36428: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853270.36433: getting variables 11762 1726853270.36434: in VariableManager get_vars() 11762 1726853270.36586: Calling all_inventory to load vars for managed_node2 11762 1726853270.36589: Calling groups_inventory to load vars for managed_node2 11762 1726853270.36592: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.36606: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.36609: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.36612: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.37225: done sending task result for task 02083763-bbaf-d845-03d0-000000000502 11762 1726853270.37229: WORKER PROCESS EXITING 11762 1726853270.39498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.42796: done with get_vars() 11762 1726853270.42912: done getting variables 11762 1726853270.42980: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853270.43219: variable 'profile' from source: include params 11762 1726853270.43224: variable 'bond_port_profile' from source: include params 11762 1726853270.43402: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:50 -0400 (0:00:00.103) 0:00:20.864 ****** 11762 1726853270.43436: entering _queue_task() for managed_node2/set_fact 11762 1726853270.44263: worker is 1 (out of 1 available) 11762 1726853270.44334: exiting _queue_task() for 
managed_node2/set_fact 11762 1726853270.44350: done queuing things up, now waiting for results queue to drain 11762 1726853270.44352: waiting for pending results... 11762 1726853270.44670: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 11762 1726853270.45178: in run() - task 02083763-bbaf-d845-03d0-000000000503 11762 1726853270.45182: variable 'ansible_search_path' from source: unknown 11762 1726853270.45185: variable 'ansible_search_path' from source: unknown 11762 1726853270.45188: calling self._execute() 11762 1726853270.45192: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.45194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.45197: variable 'omit' from source: magic vars 11762 1726853270.45933: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.46377: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.46381: variable 'profile_stat' from source: set_fact 11762 1726853270.46383: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853270.46386: when evaluation is False, skipping this task 11762 1726853270.46388: _execute() done 11762 1726853270.46390: dumping result to json 11762 1726853270.46392: done dumping result, returning 11762 1726853270.46395: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-d845-03d0-000000000503] 11762 1726853270.46397: sending task result for task 02083763-bbaf-d845-03d0-000000000503 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853270.46634: no more pending results, returning what we have 11762 1726853270.46638: results queue empty 11762 1726853270.46639: checking for any_errors_fatal 11762 1726853270.46648: done checking for any_errors_fatal 11762 
1726853270.46649: checking for max_fail_percentage 11762 1726853270.46651: done checking for max_fail_percentage 11762 1726853270.46652: checking to see if all hosts have failed and the running result is not ok 11762 1726853270.46652: done checking to see if all hosts have failed 11762 1726853270.46653: getting the remaining hosts for this loop 11762 1726853270.46655: done getting the remaining hosts for this loop 11762 1726853270.46659: getting the next task for host managed_node2 11762 1726853270.46667: done getting next task for host managed_node2 11762 1726853270.46670: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11762 1726853270.46678: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853270.46682: getting variables 11762 1726853270.46684: in VariableManager get_vars() 11762 1726853270.46718: Calling all_inventory to load vars for managed_node2 11762 1726853270.46720: Calling groups_inventory to load vars for managed_node2 11762 1726853270.46837: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.46853: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.46856: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.46858: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.47579: done sending task result for task 02083763-bbaf-d845-03d0-000000000503 11762 1726853270.47583: WORKER PROCESS EXITING 11762 1726853270.49760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.53225: done with get_vars() 11762 1726853270.53374: done getting variables 11762 1726853270.53436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853270.53776: variable 'profile' from source: include params 11762 1726853270.53781: variable 'bond_port_profile' from source: include params 11762 1726853270.53851: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:50 -0400 (0:00:00.104) 0:00:20.969 ****** 11762 1726853270.53887: entering _queue_task() for managed_node2/command 11762 
1726853270.54664: worker is 1 (out of 1 available) 11762 1726853270.54793: exiting _queue_task() for managed_node2/command 11762 1726853270.54806: done queuing things up, now waiting for results queue to drain 11762 1726853270.54808: waiting for pending results... 11762 1726853270.55259: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 11762 1726853270.55777: in run() - task 02083763-bbaf-d845-03d0-000000000504 11762 1726853270.55781: variable 'ansible_search_path' from source: unknown 11762 1726853270.55785: variable 'ansible_search_path' from source: unknown 11762 1726853270.55788: calling self._execute() 11762 1726853270.55790: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.55793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.55796: variable 'omit' from source: magic vars 11762 1726853270.56520: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.56538: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.56662: variable 'profile_stat' from source: set_fact 11762 1726853270.56890: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853270.56899: when evaluation is False, skipping this task 11762 1726853270.56906: _execute() done 11762 1726853270.56913: dumping result to json 11762 1726853270.56920: done dumping result, returning 11762 1726853270.56931: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 [02083763-bbaf-d845-03d0-000000000504] 11762 1726853270.56941: sending task result for task 02083763-bbaf-d845-03d0-000000000504 11762 1726853270.57045: done sending task result for task 02083763-bbaf-d845-03d0-000000000504 11762 1726853270.57052: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 
1726853270.57122: no more pending results, returning what we have 11762 1726853270.57127: results queue empty 11762 1726853270.57127: checking for any_errors_fatal 11762 1726853270.57138: done checking for any_errors_fatal 11762 1726853270.57138: checking for max_fail_percentage 11762 1726853270.57140: done checking for max_fail_percentage 11762 1726853270.57141: checking to see if all hosts have failed and the running result is not ok 11762 1726853270.57141: done checking to see if all hosts have failed 11762 1726853270.57144: getting the remaining hosts for this loop 11762 1726853270.57146: done getting the remaining hosts for this loop 11762 1726853270.57150: getting the next task for host managed_node2 11762 1726853270.57158: done getting next task for host managed_node2 11762 1726853270.57160: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11762 1726853270.57165: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853270.57169: getting variables 11762 1726853270.57170: in VariableManager get_vars() 11762 1726853270.57210: Calling all_inventory to load vars for managed_node2 11762 1726853270.57213: Calling groups_inventory to load vars for managed_node2 11762 1726853270.57217: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.57231: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.57234: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.57237: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.60128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.63801: done with get_vars() 11762 1726853270.63836: done getting variables 11762 1726853270.63903: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853270.64206: variable 'profile' from source: include params 11762 1726853270.64210: variable 'bond_port_profile' from source: include params 11762 1726853270.64390: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:50 -0400 (0:00:00.105) 0:00:21.074 ****** 
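Each of the ifcfg-* tasks in this stretch is skipped the same way: the `when: profile_stat.stat.exists` guard evaluates to False because the profiles here are keyfile connections under /etc/NetworkManager/system-connections, so no initscripts-style ifcfg file exists. A minimal sketch of that guard (the `should_run` helper is hypothetical):

```python
# Mirrors the evaluation logged as
# "Evaluated conditional (profile_stat.stat.exists): False".
def should_run(profile_stat: dict) -> bool:
    """True only when the stat result reports the file exists."""
    return bool(profile_stat.get("stat", {}).get("exists"))

# An absent ifcfg file produces the skip seen in the log.
print("run" if should_run({"stat": {"exists": False}}) else "skip")
```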
11762 1726853270.64425: entering _queue_task() for managed_node2/set_fact 11762 1726853270.65346: worker is 1 (out of 1 available) 11762 1726853270.65357: exiting _queue_task() for managed_node2/set_fact 11762 1726853270.65367: done queuing things up, now waiting for results queue to drain 11762 1726853270.65369: waiting for pending results... 11762 1726853270.65555: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 11762 1726853270.65898: in run() - task 02083763-bbaf-d845-03d0-000000000505 11762 1726853270.65917: variable 'ansible_search_path' from source: unknown 11762 1726853270.65926: variable 'ansible_search_path' from source: unknown 11762 1726853270.65965: calling self._execute() 11762 1726853270.66269: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.66475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.66479: variable 'omit' from source: magic vars 11762 1726853270.66850: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.67276: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.67280: variable 'profile_stat' from source: set_fact 11762 1726853270.67283: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853270.67285: when evaluation is False, skipping this task 11762 1726853270.67287: _execute() done 11762 1726853270.67289: dumping result to json 11762 1726853270.67291: done dumping result, returning 11762 1726853270.67294: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [02083763-bbaf-d845-03d0-000000000505] 11762 1726853270.67296: sending task result for task 02083763-bbaf-d845-03d0-000000000505 11762 1726853270.67365: done sending task result for task 02083763-bbaf-d845-03d0-000000000505 11762 1726853270.67370: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853270.67422: no more pending results, returning what we have 11762 1726853270.67427: results queue empty 11762 1726853270.67428: checking for any_errors_fatal 11762 1726853270.67434: done checking for any_errors_fatal 11762 1726853270.67435: checking for max_fail_percentage 11762 1726853270.67437: done checking for max_fail_percentage 11762 1726853270.67437: checking to see if all hosts have failed and the running result is not ok 11762 1726853270.67438: done checking to see if all hosts have failed 11762 1726853270.67439: getting the remaining hosts for this loop 11762 1726853270.67441: done getting the remaining hosts for this loop 11762 1726853270.67452: getting the next task for host managed_node2 11762 1726853270.67462: done getting next task for host managed_node2 11762 1726853270.67465: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11762 1726853270.67472: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853270.67477: getting variables 11762 1726853270.67478: in VariableManager get_vars() 11762 1726853270.67513: Calling all_inventory to load vars for managed_node2 11762 1726853270.67516: Calling groups_inventory to load vars for managed_node2 11762 1726853270.67519: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.67533: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.67536: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.67539: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.70523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.72378: done with get_vars() 11762 1726853270.72407: done getting variables 11762 1726853270.72467: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853270.72593: variable 'profile' from source: include params 11762 1726853270.72597: variable 'bond_port_profile' from source: include params 11762 1726853270.72653: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:50 -0400 (0:00:00.082) 0:00:21.157 ****** 11762 1726853270.72693: entering _queue_task() for managed_node2/assert 11762 1726853270.73328: worker is 1 (out of 1 available) 11762 1726853270.73341: exiting _queue_task() for 
managed_node2/assert 11762 1726853270.73354: done queuing things up, now waiting for results queue to drain 11762 1726853270.73355: waiting for pending results... 11762 1726853270.73693: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' 11762 1726853270.73937: in run() - task 02083763-bbaf-d845-03d0-0000000004da 11762 1726853270.73955: variable 'ansible_search_path' from source: unknown 11762 1726853270.73959: variable 'ansible_search_path' from source: unknown 11762 1726853270.74097: calling self._execute() 11762 1726853270.74257: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.74265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.74275: variable 'omit' from source: magic vars 11762 1726853270.75060: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.75072: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.75079: variable 'omit' from source: magic vars 11762 1726853270.75148: variable 'omit' from source: magic vars 11762 1726853270.75256: variable 'profile' from source: include params 11762 1726853270.75260: variable 'bond_port_profile' from source: include params 11762 1726853270.75327: variable 'bond_port_profile' from source: include params 11762 1726853270.75356: variable 'omit' from source: magic vars 11762 1726853270.75397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853270.75436: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853270.75464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853270.75484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853270.75497: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853270.75528: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853270.75531: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.75541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.75644: Set connection var ansible_timeout to 10 11762 1726853270.75655: Set connection var ansible_shell_type to sh 11762 1726853270.75661: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853270.75673: Set connection var ansible_shell_executable to /bin/sh 11762 1726853270.75682: Set connection var ansible_pipelining to False 11762 1726853270.75689: Set connection var ansible_connection to ssh 11762 1726853270.75713: variable 'ansible_shell_executable' from source: unknown 11762 1726853270.75716: variable 'ansible_connection' from source: unknown 11762 1726853270.75719: variable 'ansible_module_compression' from source: unknown 11762 1726853270.75721: variable 'ansible_shell_type' from source: unknown 11762 1726853270.75723: variable 'ansible_shell_executable' from source: unknown 11762 1726853270.75725: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.75731: variable 'ansible_pipelining' from source: unknown 11762 1726853270.75733: variable 'ansible_timeout' from source: unknown 11762 1726853270.75735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.75910: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853270.75920: variable 'omit' from source: magic vars 11762 1726853270.75977: starting 
attempt loop 11762 1726853270.75980: running the handler 11762 1726853270.76046: variable 'lsr_net_profile_exists' from source: set_fact 11762 1726853270.76054: Evaluated conditional (lsr_net_profile_exists): True 11762 1726853270.76060: handler run complete 11762 1726853270.76086: attempt loop complete, returning result 11762 1726853270.76089: _execute() done 11762 1726853270.76092: dumping result to json 11762 1726853270.76095: done dumping result, returning 11762 1726853270.76109: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' [02083763-bbaf-d845-03d0-0000000004da] 11762 1726853270.76115: sending task result for task 02083763-bbaf-d845-03d0-0000000004da 11762 1726853270.76251: done sending task result for task 02083763-bbaf-d845-03d0-0000000004da 11762 1726853270.76254: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853270.76344: no more pending results, returning what we have 11762 1726853270.76348: results queue empty 11762 1726853270.76349: checking for any_errors_fatal 11762 1726853270.76355: done checking for any_errors_fatal 11762 1726853270.76356: checking for max_fail_percentage 11762 1726853270.76357: done checking for max_fail_percentage 11762 1726853270.76358: checking to see if all hosts have failed and the running result is not ok 11762 1726853270.76359: done checking to see if all hosts have failed 11762 1726853270.76359: getting the remaining hosts for this loop 11762 1726853270.76361: done getting the remaining hosts for this loop 11762 1726853270.76364: getting the next task for host managed_node2 11762 1726853270.76372: done getting next task for host managed_node2 11762 1726853270.76375: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11762 1726853270.76379: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853270.76383: getting variables 11762 1726853270.76384: in VariableManager get_vars() 11762 1726853270.76413: Calling all_inventory to load vars for managed_node2 11762 1726853270.76416: Calling groups_inventory to load vars for managed_node2 11762 1726853270.76419: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.76428: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.76431: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.76434: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.79707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.83236: done with get_vars() 11762 1726853270.83389: done getting variables 11762 1726853270.83455: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853270.83779: variable 'profile' from source: include params 11762 1726853270.83783: variable 'bond_port_profile' from source: include params 11762 1726853270.83853: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:50 -0400 (0:00:00.112) 0:00:21.270 ****** 11762 1726853270.83991: entering _queue_task() for managed_node2/assert 11762 1726853270.84732: worker is 1 (out of 1 available) 11762 1726853270.84749: exiting _queue_task() for managed_node2/assert 11762 1726853270.84762: done queuing things up, now waiting for results queue to drain 11762 1726853270.84763: waiting for 
pending results... 11762 1726853270.85289: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' 11762 1726853270.85677: in run() - task 02083763-bbaf-d845-03d0-0000000004db 11762 1726853270.85681: variable 'ansible_search_path' from source: unknown 11762 1726853270.85684: variable 'ansible_search_path' from source: unknown 11762 1726853270.85686: calling self._execute() 11762 1726853270.85689: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.85692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.85694: variable 'omit' from source: magic vars 11762 1726853270.86405: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.86676: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.86680: variable 'omit' from source: magic vars 11762 1726853270.86683: variable 'omit' from source: magic vars 11762 1726853270.86765: variable 'profile' from source: include params 11762 1726853270.87076: variable 'bond_port_profile' from source: include params 11762 1726853270.87079: variable 'bond_port_profile' from source: include params 11762 1726853270.87082: variable 'omit' from source: magic vars 11762 1726853270.87116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853270.87156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853270.87476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853270.87479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853270.87482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 
1726853270.87484: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853270.87486: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.87488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.87575: Set connection var ansible_timeout to 10 11762 1726853270.87584: Set connection var ansible_shell_type to sh 11762 1726853270.87876: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853270.87880: Set connection var ansible_shell_executable to /bin/sh 11762 1726853270.87883: Set connection var ansible_pipelining to False 11762 1726853270.87886: Set connection var ansible_connection to ssh 11762 1726853270.87888: variable 'ansible_shell_executable' from source: unknown 11762 1726853270.87890: variable 'ansible_connection' from source: unknown 11762 1726853270.87892: variable 'ansible_module_compression' from source: unknown 11762 1726853270.87895: variable 'ansible_shell_type' from source: unknown 11762 1726853270.87898: variable 'ansible_shell_executable' from source: unknown 11762 1726853270.87900: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.87902: variable 'ansible_pipelining' from source: unknown 11762 1726853270.87905: variable 'ansible_timeout' from source: unknown 11762 1726853270.87907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.88025: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853270.88376: variable 'omit' from source: magic vars 11762 1726853270.88379: starting attempt loop 11762 1726853270.88382: running the handler 11762 1726853270.88422: variable 
'lsr_net_profile_ansible_managed' from source: set_fact 11762 1726853270.88432: Evaluated conditional (lsr_net_profile_ansible_managed): True 11762 1726853270.88442: handler run complete 11762 1726853270.88461: attempt loop complete, returning result 11762 1726853270.88468: _execute() done 11762 1726853270.88477: dumping result to json 11762 1726853270.88486: done dumping result, returning 11762 1726853270.88498: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' [02083763-bbaf-d845-03d0-0000000004db] 11762 1726853270.88509: sending task result for task 02083763-bbaf-d845-03d0-0000000004db ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853270.88664: no more pending results, returning what we have 11762 1726853270.88668: results queue empty 11762 1726853270.88669: checking for any_errors_fatal 11762 1726853270.88678: done checking for any_errors_fatal 11762 1726853270.88679: checking for max_fail_percentage 11762 1726853270.88681: done checking for max_fail_percentage 11762 1726853270.88681: checking to see if all hosts have failed and the running result is not ok 11762 1726853270.88682: done checking to see if all hosts have failed 11762 1726853270.88683: getting the remaining hosts for this loop 11762 1726853270.88685: done getting the remaining hosts for this loop 11762 1726853270.88689: getting the next task for host managed_node2 11762 1726853270.88697: done getting next task for host managed_node2 11762 1726853270.88699: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11762 1726853270.88704: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853270.88708: getting variables 11762 1726853270.88709: in VariableManager get_vars() 11762 1726853270.88746: Calling all_inventory to load vars for managed_node2 11762 1726853270.88748: Calling groups_inventory to load vars for managed_node2 11762 1726853270.88752: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853270.88764: Calling all_plugins_play to load vars for managed_node2 11762 1726853270.88767: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853270.88886: Calling groups_plugins_play to load vars for managed_node2 11762 1726853270.89408: done sending task result for task 02083763-bbaf-d845-03d0-0000000004db 11762 1726853270.89413: WORKER PROCESS EXITING 11762 1726853270.91768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853270.95312: done with get_vars() 11762 1726853270.95467: done getting variables 11762 1726853270.95532: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853270.95769: variable 'profile' from source: include params 11762 1726853270.95897: variable 'bond_port_profile' from source: include params 11762 1726853270.95958: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:50 -0400 (0:00:00.120) 0:00:21.390 ****** 11762 1726853270.96023: entering _queue_task() for managed_node2/assert 11762 1726853270.96831: worker is 1 (out of 1 available) 11762 1726853270.96847: exiting _queue_task() for managed_node2/assert 11762 1726853270.96860: done queuing things up, now waiting for results queue to drain 11762 1726853270.96862: waiting for pending results... 
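[Editor's note: the three assert tasks traced in this log (profile present, ansible managed comment present, fingerprint comment present) each evaluate a single boolean flag previously stored via set_fact. A hypothetical sketch of that pattern from assert_profile_present.yml — the task names and flag names (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint) are taken from the log; the exact task bodies are assumed, not the actual file contents:]

```yaml
# Hypothetical sketch of the assert pattern this log is executing.
# Flag names come from the log's "variable ... from source: set_fact"
# lines; the assert bodies themselves are assumptions.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint
```

[Each flag evaluating True yields the "ok: [managed_node2] ... All assertions passed" results recorded in this section.]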
11762 1726853270.97310: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 11762 1726853270.97646: in run() - task 02083763-bbaf-d845-03d0-0000000004dc 11762 1726853270.97650: variable 'ansible_search_path' from source: unknown 11762 1726853270.97653: variable 'ansible_search_path' from source: unknown 11762 1726853270.97674: calling self._execute() 11762 1726853270.97882: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.97893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.97905: variable 'omit' from source: magic vars 11762 1726853270.98836: variable 'ansible_distribution_major_version' from source: facts 11762 1726853270.98840: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853270.98843: variable 'omit' from source: magic vars 11762 1726853270.98846: variable 'omit' from source: magic vars 11762 1726853270.99070: variable 'profile' from source: include params 11762 1726853270.99083: variable 'bond_port_profile' from source: include params 11762 1726853270.99148: variable 'bond_port_profile' from source: include params 11762 1726853270.99293: variable 'omit' from source: magic vars 11762 1726853270.99339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853270.99594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853270.99597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853270.99600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853270.99603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853270.99606: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853270.99608: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853270.99609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853270.99769: Set connection var ansible_timeout to 10 11762 1726853271.00028: Set connection var ansible_shell_type to sh 11762 1726853271.00032: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853271.00034: Set connection var ansible_shell_executable to /bin/sh 11762 1726853271.00037: Set connection var ansible_pipelining to False 11762 1726853271.00039: Set connection var ansible_connection to ssh 11762 1726853271.00041: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.00043: variable 'ansible_connection' from source: unknown 11762 1726853271.00045: variable 'ansible_module_compression' from source: unknown 11762 1726853271.00047: variable 'ansible_shell_type' from source: unknown 11762 1726853271.00049: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.00050: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.00052: variable 'ansible_pipelining' from source: unknown 11762 1726853271.00055: variable 'ansible_timeout' from source: unknown 11762 1726853271.00057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.00367: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853271.00386: variable 'omit' from source: magic vars 11762 1726853271.00396: starting attempt loop 11762 1726853271.00403: running the handler 11762 1726853271.00643: variable 'lsr_net_profile_fingerprint' from source: set_fact 
11762 1726853271.00654: Evaluated conditional (lsr_net_profile_fingerprint): True 11762 1726853271.00663: handler run complete 11762 1726853271.00694: attempt loop complete, returning result 11762 1726853271.00895: _execute() done 11762 1726853271.00898: dumping result to json 11762 1726853271.00901: done dumping result, returning 11762 1726853271.00903: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 [02083763-bbaf-d845-03d0-0000000004dc] 11762 1726853271.00905: sending task result for task 02083763-bbaf-d845-03d0-0000000004dc 11762 1726853271.00970: done sending task result for task 02083763-bbaf-d845-03d0-0000000004dc 11762 1726853271.00976: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853271.01108: no more pending results, returning what we have 11762 1726853271.01112: results queue empty 11762 1726853271.01112: checking for any_errors_fatal 11762 1726853271.01118: done checking for any_errors_fatal 11762 1726853271.01119: checking for max_fail_percentage 11762 1726853271.01121: done checking for max_fail_percentage 11762 1726853271.01121: checking to see if all hosts have failed and the running result is not ok 11762 1726853271.01122: done checking to see if all hosts have failed 11762 1726853271.01123: getting the remaining hosts for this loop 11762 1726853271.01124: done getting the remaining hosts for this loop 11762 1726853271.01127: getting the next task for host managed_node2 11762 1726853271.01137: done getting next task for host managed_node2 11762 1726853271.01139: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11762 1726853271.01146: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853271.01149: getting variables 11762 1726853271.01151: in VariableManager get_vars() 11762 1726853271.01385: Calling all_inventory to load vars for managed_node2 11762 1726853271.01388: Calling groups_inventory to load vars for managed_node2 11762 1726853271.01391: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853271.01400: Calling all_plugins_play to load vars for managed_node2 11762 1726853271.01403: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853271.01405: Calling groups_plugins_play to load vars for managed_node2 11762 1726853271.14745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853271.18216: done with get_vars() 11762 1726853271.18251: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 
September 2024 13:27:51 -0400 (0:00:00.224) 0:00:21.615 ****** 11762 1726853271.18457: entering _queue_task() for managed_node2/include_tasks 11762 1726853271.19199: worker is 1 (out of 1 available) 11762 1726853271.19214: exiting _queue_task() for managed_node2/include_tasks 11762 1726853271.19227: done queuing things up, now waiting for results queue to drain 11762 1726853271.19228: waiting for pending results... 11762 1726853271.19791: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 11762 1726853271.20146: in run() - task 02083763-bbaf-d845-03d0-0000000004e0 11762 1726853271.20151: variable 'ansible_search_path' from source: unknown 11762 1726853271.20154: variable 'ansible_search_path' from source: unknown 11762 1726853271.20182: calling self._execute() 11762 1726853271.20343: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.20481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.20495: variable 'omit' from source: magic vars 11762 1726853271.21213: variable 'ansible_distribution_major_version' from source: facts 11762 1726853271.21287: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853271.21299: _execute() done 11762 1726853271.21308: dumping result to json 11762 1726853271.21316: done dumping result, returning 11762 1726853271.21329: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-d845-03d0-0000000004e0] 11762 1726853271.21343: sending task result for task 02083763-bbaf-d845-03d0-0000000004e0 11762 1726853271.21736: done sending task result for task 02083763-bbaf-d845-03d0-0000000004e0 11762 1726853271.21740: WORKER PROCESS EXITING 11762 1726853271.21789: no more pending results, returning what we have 11762 1726853271.21794: in VariableManager get_vars() 11762 1726853271.21834: Calling all_inventory to load vars for managed_node2 11762 
1726853271.21836: Calling groups_inventory to load vars for managed_node2 11762 1726853271.21840: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853271.21854: Calling all_plugins_play to load vars for managed_node2 11762 1726853271.21857: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853271.21860: Calling groups_plugins_play to load vars for managed_node2 11762 1726853271.24055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853271.25913: done with get_vars() 11762 1726853271.25934: variable 'ansible_search_path' from source: unknown 11762 1726853271.25936: variable 'ansible_search_path' from source: unknown 11762 1726853271.25985: we have included files to process 11762 1726853271.25987: generating all_blocks data 11762 1726853271.25989: done generating all_blocks data 11762 1726853271.25994: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11762 1726853271.25995: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11762 1726853271.25997: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11762 1726853271.27396: done processing included file 11762 1726853271.27398: iterating over new_blocks loaded from include file 11762 1726853271.27399: in VariableManager get_vars() 11762 1726853271.27418: done with get_vars() 11762 1726853271.27421: filtering new block on tags 11762 1726853271.27505: done filtering new block on tags 11762 1726853271.27508: in VariableManager get_vars() 11762 1726853271.27523: done with get_vars() 11762 1726853271.27524: filtering new block on tags 11762 1726853271.27583: done filtering new block on tags 11762 1726853271.27591: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 11762 1726853271.27597: extending task lists for all hosts with included blocks 11762 1726853271.28214: done extending task lists 11762 1726853271.28215: done processing included files 11762 1726853271.28216: results queue empty 11762 1726853271.28216: checking for any_errors_fatal 11762 1726853271.28222: done checking for any_errors_fatal 11762 1726853271.28223: checking for max_fail_percentage 11762 1726853271.28224: done checking for max_fail_percentage 11762 1726853271.28225: checking to see if all hosts have failed and the running result is not ok 11762 1726853271.28225: done checking to see if all hosts have failed 11762 1726853271.28226: getting the remaining hosts for this loop 11762 1726853271.28227: done getting the remaining hosts for this loop 11762 1726853271.28230: getting the next task for host managed_node2 11762 1726853271.28234: done getting next task for host managed_node2 11762 1726853271.28236: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11762 1726853271.28289: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853271.28292: getting variables 11762 1726853271.28294: in VariableManager get_vars() 11762 1726853271.28304: Calling all_inventory to load vars for managed_node2 11762 1726853271.28306: Calling groups_inventory to load vars for managed_node2 11762 1726853271.28308: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853271.28315: Calling all_plugins_play to load vars for managed_node2 11762 1726853271.28317: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853271.28320: Calling groups_plugins_play to load vars for managed_node2 11762 1726853271.29656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853271.31332: done with get_vars() 11762 1726853271.31359: done getting variables 11762 1726853271.31417: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:51 -0400 (0:00:00.129) 0:00:21.744 ****** 11762 1726853271.31450: entering _queue_task() for managed_node2/set_fact 11762 1726853271.31830: worker is 1 (out of 1 available) 11762 1726853271.31847: exiting _queue_task() for managed_node2/set_fact 11762 1726853271.31859: done queuing things up, now waiting for results queue to drain 11762 1726853271.31860: waiting for pending results... 11762 1726853271.32134: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 11762 1726853271.32321: in run() - task 02083763-bbaf-d845-03d0-000000000558 11762 1726853271.32337: variable 'ansible_search_path' from source: unknown 11762 1726853271.32341: variable 'ansible_search_path' from source: unknown 11762 1726853271.32477: calling self._execute() 11762 1726853271.32485: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.32502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.32512: variable 'omit' from source: magic vars 11762 1726853271.33023: variable 'ansible_distribution_major_version' from source: facts 11762 1726853271.33027: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853271.33030: variable 'omit' from source: magic vars 11762 1726853271.33100: variable 'omit' from source: magic vars 11762 1726853271.33135: variable 'omit' from source: magic vars 11762 1726853271.33194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853271.33279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853271.33282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853271.33284: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853271.33287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853271.33317: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853271.33321: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.33323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.33570: Set connection var ansible_timeout to 10 11762 1726853271.33575: Set connection var ansible_shell_type to sh 11762 1726853271.33577: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853271.33579: Set connection var ansible_shell_executable to /bin/sh 11762 1726853271.33581: Set connection var ansible_pipelining to False 11762 1726853271.33584: Set connection var ansible_connection to ssh 11762 1726853271.33587: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.33589: variable 'ansible_connection' from source: unknown 11762 1726853271.33593: variable 'ansible_module_compression' from source: unknown 11762 1726853271.33595: variable 'ansible_shell_type' from source: unknown 11762 1726853271.33598: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.33600: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.33603: variable 'ansible_pipelining' from source: unknown 11762 1726853271.33606: variable 'ansible_timeout' from source: unknown 11762 1726853271.33608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.33683: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853271.33700: variable 'omit' from source: magic vars 11762 1726853271.33705: starting attempt loop 11762 1726853271.33709: running the handler 11762 1726853271.33719: handler run complete 11762 1726853271.33737: attempt loop complete, returning result 11762 1726853271.33740: _execute() done 11762 1726853271.33745: dumping result to json 11762 1726853271.33748: done dumping result, returning 11762 1726853271.33750: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-d845-03d0-000000000558] 11762 1726853271.33757: sending task result for task 02083763-bbaf-d845-03d0-000000000558 11762 1726853271.33857: done sending task result for task 02083763-bbaf-d845-03d0-000000000558 11762 1726853271.33859: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11762 1726853271.34022: no more pending results, returning what we have 11762 1726853271.34025: results queue empty 11762 1726853271.34026: checking for any_errors_fatal 11762 1726853271.34027: done checking for any_errors_fatal 11762 1726853271.34028: checking for max_fail_percentage 11762 1726853271.34030: done checking for max_fail_percentage 11762 1726853271.34031: checking to see if all hosts have failed and the running result is not ok 11762 1726853271.34032: done checking to see if all hosts have failed 11762 1726853271.34032: getting the remaining hosts for this loop 11762 1726853271.34034: done getting the remaining hosts for this loop 11762 1726853271.34037: getting the next task for host managed_node2 11762 1726853271.34046: done getting next task for host managed_node2 11762 1726853271.34048: ^ task is: 
TASK: Stat profile file 11762 1726853271.34054: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853271.34057: getting variables 11762 1726853271.34059: in VariableManager get_vars() 11762 1726853271.34092: Calling all_inventory to load vars for managed_node2 11762 1726853271.34095: Calling groups_inventory to load vars for managed_node2 11762 1726853271.34098: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853271.34111: Calling all_plugins_play to load vars for managed_node2 11762 1726853271.34114: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853271.34117: Calling groups_plugins_play to load vars for managed_node2 11762 1726853271.35834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853271.37510: done with get_vars() 11762 1726853271.37595: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:51 -0400 (0:00:00.062) 0:00:21.807 ****** 11762 1726853271.37678: entering _queue_task() for managed_node2/stat 11762 1726853271.38045: worker is 1 (out of 1 available) 11762 1726853271.38064: exiting _queue_task() for managed_node2/stat 11762 1726853271.38079: done queuing things up, now waiting for results queue to drain 11762 1726853271.38080: waiting for pending results... 
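For readers tracing this run: the set_fact result shown above and the "Stat profile file" task now being queued both come from get_profile_stat.yml (lines 3 and 9, per the task paths in the log). A minimal sketch of what those two tasks likely look like — the three fact names and their initial false values are copied verbatim from the logged "ok:" result, while the stat target path is an assumption inferred from the `profile` / `bond_port_profile` include params that appear later in the trace, not something the log states:

```yaml
# Hypothetical reconstruction of tasks/get_profile_stat.yml (a sketch, not the
# authoritative source file). Fact names and values match the "ok:" result
# logged for task 02083763-bbaf-d845-03d0-000000000558.
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

# The path expression below is an assumption; the log confirms only that the
# 'stat' module runs next ("entering _queue_task() for managed_node2/stat")
# and that a 'profile' variable is resolved from include params.
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat
```

The subsequent `_low_level_execute_command()` entries in the log show the standard execution path for such a module task: probe the remote home directory (`echo ~`), create a per-task temp directory, transfer the AnsiballZ-wrapped module (`AnsiballZ_stat.py`) over SFTP, `chmod u+x` it, then invoke it with the remote Python interpreter.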
11762 1726853271.38492: running TaskExecutor() for managed_node2/TASK: Stat profile file 11762 1726853271.38518: in run() - task 02083763-bbaf-d845-03d0-000000000559 11762 1726853271.38541: variable 'ansible_search_path' from source: unknown 11762 1726853271.38552: variable 'ansible_search_path' from source: unknown 11762 1726853271.38605: calling self._execute() 11762 1726853271.38707: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.38727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.38747: variable 'omit' from source: magic vars 11762 1726853271.39114: variable 'ansible_distribution_major_version' from source: facts 11762 1726853271.39127: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853271.39130: variable 'omit' from source: magic vars 11762 1726853271.39189: variable 'omit' from source: magic vars 11762 1726853271.39305: variable 'profile' from source: include params 11762 1726853271.39320: variable 'bond_port_profile' from source: include params 11762 1726853271.39415: variable 'bond_port_profile' from source: include params 11762 1726853271.39455: variable 'omit' from source: magic vars 11762 1726853271.39606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853271.39767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853271.39796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853271.39816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853271.39833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853271.39881: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11762 1726853271.39952: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.39957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.40028: Set connection var ansible_timeout to 10 11762 1726853271.40037: Set connection var ansible_shell_type to sh 11762 1726853271.40048: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853271.40077: Set connection var ansible_shell_executable to /bin/sh 11762 1726853271.40276: Set connection var ansible_pipelining to False 11762 1726853271.40279: Set connection var ansible_connection to ssh 11762 1726853271.40281: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.40283: variable 'ansible_connection' from source: unknown 11762 1726853271.40285: variable 'ansible_module_compression' from source: unknown 11762 1726853271.40287: variable 'ansible_shell_type' from source: unknown 11762 1726853271.40288: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.40290: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.40292: variable 'ansible_pipelining' from source: unknown 11762 1726853271.40294: variable 'ansible_timeout' from source: unknown 11762 1726853271.40296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.40373: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853271.40392: variable 'omit' from source: magic vars 11762 1726853271.40402: starting attempt loop 11762 1726853271.40425: running the handler 11762 1726853271.40443: _low_level_execute_command(): starting 11762 1726853271.40456: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853271.41155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853271.41182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853271.41186: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.41189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853271.41210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.41247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.41256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.41343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.43118: stdout chunk (state=3): >>>/root <<< 11762 1726853271.43291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.43328: stderr chunk (state=3): >>><<< 11762 1726853271.43331: stdout chunk (state=3): >>><<< 11762 1726853271.43464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853271.43467: _low_level_execute_command(): starting 11762 1726853271.43470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234 `" && echo ansible-tmp-1726853271.433654-12740-264808635270234="` echo /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234 `" ) && sleep 0' 11762 1726853271.44007: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853271.44022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853271.44037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.44064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853271.44146: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.44180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853271.44199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.44217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.44375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.46508: stdout chunk (state=3): >>>ansible-tmp-1726853271.433654-12740-264808635270234=/root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234 <<< 11762 1726853271.46794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.46798: stderr chunk (state=3): >>><<< 11762 1726853271.46800: stdout chunk (state=3): >>><<< 11762 1726853271.46803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853271.433654-12740-264808635270234=/root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853271.46806: variable 'ansible_module_compression' from source: unknown 11762 1726853271.46847: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853271.46881: variable 'ansible_facts' from source: unknown 11762 1726853271.46991: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/AnsiballZ_stat.py 11762 1726853271.47289: Sending initial data 11762 1726853271.47292: Sent initial data (152 bytes) 11762 1726853271.47891: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853271.47909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853271.47931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.47947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853271.48015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.48097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853271.48100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.48130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.48306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.50055: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853271.50277: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853271.50295: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp9p45t2da /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/AnsiballZ_stat.py <<< 11762 1726853271.50298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/AnsiballZ_stat.py" <<< 11762 1726853271.50400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp9p45t2da" to remote "/root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/AnsiballZ_stat.py" <<< 11762 1726853271.51491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.51538: stderr chunk (state=3): >>><<< 11762 1726853271.51541: stdout chunk (state=3): >>><<< 11762 1726853271.51600: done transferring module to remote 11762 1726853271.51611: _low_level_execute_command(): starting 11762 1726853271.51618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/ /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/AnsiballZ_stat.py && sleep 0' 11762 1726853271.52437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853271.52450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853271.52485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.52647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853271.52652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.52659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.52724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.54908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.54912: stdout chunk (state=3): >>><<< 11762 1726853271.54914: stderr chunk (state=3): >>><<< 11762 1726853271.55029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853271.55032: _low_level_execute_command(): starting 11762 1726853271.55069: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/AnsiballZ_stat.py && sleep 0' 11762 1726853271.56467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.56560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.56621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.56758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.72960: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853271.74353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853271.74431: stderr chunk (state=3): >>><<< 11762 1726853271.74435: stdout chunk (state=3): >>><<< 11762 1726853271.74576: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853271.74580: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853271.74583: _low_level_execute_command(): starting 11762 1726853271.74585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853271.433654-12740-264808635270234/ > /dev/null 2>&1 && sleep 0' 11762 1726853271.75158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.75202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853271.75231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.75284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.75485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.77410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.77579: stderr chunk (state=3): >>><<< 11762 1726853271.77583: stdout chunk (state=3): >>><<< 11762 1726853271.77592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 11762 1726853271.77594: handler run complete 11762 1726853271.77596: attempt loop complete, returning result 11762 1726853271.77599: _execute() done 11762 1726853271.77601: dumping result to json 11762 1726853271.77603: done dumping result, returning 11762 1726853271.77605: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-d845-03d0-000000000559] 11762 1726853271.77607: sending task result for task 02083763-bbaf-d845-03d0-000000000559 11762 1726853271.77690: done sending task result for task 02083763-bbaf-d845-03d0-000000000559 11762 1726853271.77693: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11762 1726853271.77762: no more pending results, returning what we have 11762 1726853271.77766: results queue empty 11762 1726853271.77767: checking for any_errors_fatal 11762 1726853271.77781: done checking for any_errors_fatal 11762 1726853271.77782: checking for max_fail_percentage 11762 1726853271.77784: done checking for max_fail_percentage 11762 1726853271.77785: checking to see if all hosts have failed and the running result is not ok 11762 1726853271.77786: done checking to see if all hosts have failed 11762 1726853271.77787: getting the remaining hosts for this loop 11762 1726853271.77789: done getting the remaining hosts for this loop 11762 1726853271.77793: getting the next task for host managed_node2 11762 1726853271.77846: done getting next task for host managed_node2 11762 1726853271.77850: ^ task is: TASK: Set NM profile exist flag based on the profile files 11762 1726853271.77884: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853271.77889: getting variables 11762 1726853271.77891: in VariableManager get_vars() 11762 1726853271.78099: Calling all_inventory to load vars for managed_node2 11762 1726853271.78103: Calling groups_inventory to load vars for managed_node2 11762 1726853271.78107: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853271.78120: Calling all_plugins_play to load vars for managed_node2 11762 1726853271.78157: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853271.78166: Calling groups_plugins_play to load vars for managed_node2 11762 1726853271.79162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853271.80446: done with get_vars() 11762 1726853271.80470: done getting variables 11762 1726853271.80531: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:51 -0400 (0:00:00.428) 0:00:22.236 ****** 11762 1726853271.80569: entering _queue_task() for managed_node2/set_fact 11762 1726853271.80950: worker is 1 (out of 1 available) 11762 1726853271.80967: exiting _queue_task() for managed_node2/set_fact 11762 1726853271.81086: done queuing things up, now waiting for results queue to drain 11762 1726853271.81089: waiting for pending results... 
11762 1726853271.81305: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11762 1726853271.81494: in run() - task 02083763-bbaf-d845-03d0-00000000055a 11762 1726853271.81502: variable 'ansible_search_path' from source: unknown 11762 1726853271.81506: variable 'ansible_search_path' from source: unknown 11762 1726853271.81542: calling self._execute() 11762 1726853271.81632: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.81636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.81639: variable 'omit' from source: magic vars 11762 1726853271.82015: variable 'ansible_distribution_major_version' from source: facts 11762 1726853271.82018: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853271.82145: variable 'profile_stat' from source: set_fact 11762 1726853271.82156: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853271.82160: when evaluation is False, skipping this task 11762 1726853271.82162: _execute() done 11762 1726853271.82165: dumping result to json 11762 1726853271.82167: done dumping result, returning 11762 1726853271.82170: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-d845-03d0-00000000055a] 11762 1726853271.82175: sending task result for task 02083763-bbaf-d845-03d0-00000000055a 11762 1726853271.82330: done sending task result for task 02083763-bbaf-d845-03d0-00000000055a 11762 1726853271.82334: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853271.82399: no more pending results, returning what we have 11762 1726853271.82402: results queue empty 11762 1726853271.82403: checking for any_errors_fatal 11762 1726853271.82409: done checking for any_errors_fatal 11762 1726853271.82409: 
checking for max_fail_percentage 11762 1726853271.82411: done checking for max_fail_percentage 11762 1726853271.82412: checking to see if all hosts have failed and the running result is not ok 11762 1726853271.82412: done checking to see if all hosts have failed 11762 1726853271.82413: getting the remaining hosts for this loop 11762 1726853271.82414: done getting the remaining hosts for this loop 11762 1726853271.82417: getting the next task for host managed_node2 11762 1726853271.82423: done getting next task for host managed_node2 11762 1726853271.82426: ^ task is: TASK: Get NM profile info 11762 1726853271.82432: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11762 1726853271.82434: getting variables 11762 1726853271.82435: in VariableManager get_vars() 11762 1726853271.82465: Calling all_inventory to load vars for managed_node2 11762 1726853271.82467: Calling groups_inventory to load vars for managed_node2 11762 1726853271.82470: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853271.82481: Calling all_plugins_play to load vars for managed_node2 11762 1726853271.82483: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853271.82486: Calling groups_plugins_play to load vars for managed_node2 11762 1726853271.83521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853271.85011: done with get_vars() 11762 1726853271.85037: done getting variables 11762 1726853271.85099: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:51 -0400 (0:00:00.045) 0:00:22.281 ****** 11762 1726853271.85133: entering _queue_task() for managed_node2/shell 11762 1726853271.85534: worker is 1 (out of 1 available) 11762 1726853271.85565: exiting _queue_task() for managed_node2/shell 11762 1726853271.85582: done queuing things up, now waiting for results queue to drain 11762 1726853271.85584: waiting for pending results... 
11762 1726853271.85767: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11762 1726853271.85860: in run() - task 02083763-bbaf-d845-03d0-00000000055b 11762 1726853271.85879: variable 'ansible_search_path' from source: unknown 11762 1726853271.85883: variable 'ansible_search_path' from source: unknown 11762 1726853271.85913: calling self._execute() 11762 1726853271.85988: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.85993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.86001: variable 'omit' from source: magic vars 11762 1726853271.86294: variable 'ansible_distribution_major_version' from source: facts 11762 1726853271.86305: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853271.86312: variable 'omit' from source: magic vars 11762 1726853271.86357: variable 'omit' from source: magic vars 11762 1726853271.86428: variable 'profile' from source: include params 11762 1726853271.86432: variable 'bond_port_profile' from source: include params 11762 1726853271.86481: variable 'bond_port_profile' from source: include params 11762 1726853271.86498: variable 'omit' from source: magic vars 11762 1726853271.86534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853271.86566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853271.86584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853271.86598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853271.86608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853271.86635: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11762 1726853271.86639: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.86641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.86712: Set connection var ansible_timeout to 10 11762 1726853271.86715: Set connection var ansible_shell_type to sh 11762 1726853271.86720: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853271.86725: Set connection var ansible_shell_executable to /bin/sh 11762 1726853271.86733: Set connection var ansible_pipelining to False 11762 1726853271.86740: Set connection var ansible_connection to ssh 11762 1726853271.86759: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.86763: variable 'ansible_connection' from source: unknown 11762 1726853271.86765: variable 'ansible_module_compression' from source: unknown 11762 1726853271.86767: variable 'ansible_shell_type' from source: unknown 11762 1726853271.86769: variable 'ansible_shell_executable' from source: unknown 11762 1726853271.86773: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853271.86776: variable 'ansible_pipelining' from source: unknown 11762 1726853271.86778: variable 'ansible_timeout' from source: unknown 11762 1726853271.86784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853271.86889: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853271.86898: variable 'omit' from source: magic vars 11762 1726853271.86903: starting attempt loop 11762 1726853271.86906: running the handler 11762 1726853271.86915: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853271.86934: _low_level_execute_command(): starting 11762 1726853271.86940: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853271.87565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.87569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.87575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853271.87577: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.87659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.87728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.89467: stdout chunk (state=3): >>>/root <<< 11762 1726853271.89588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.89592: stdout 
chunk (state=3): >>><<< 11762 1726853271.89601: stderr chunk (state=3): >>><<< 11762 1726853271.89638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853271.89641: _low_level_execute_command(): starting 11762 1726853271.89648: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815 `" && echo ansible-tmp-1726853271.8962336-12771-177249585351815="` echo /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815 `" ) && sleep 0' 11762 1726853271.90065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853271.90083: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853271.90109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.90113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853271.90117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.90169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853271.90178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.90182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.90248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.92273: stdout chunk (state=3): >>>ansible-tmp-1726853271.8962336-12771-177249585351815=/root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815 <<< 11762 1726853271.92401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.92423: stderr chunk (state=3): >>><<< 11762 1726853271.92427: stdout chunk (state=3): >>><<< 11762 1726853271.92446: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853271.8962336-12771-177249585351815=/root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853271.92477: variable 'ansible_module_compression' from source: unknown 11762 1726853271.92517: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853271.92549: variable 'ansible_facts' from source: unknown 11762 1726853271.92610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/AnsiballZ_command.py 11762 1726853271.92716: Sending initial data 11762 1726853271.92719: Sent initial data (156 bytes) 11762 1726853271.93157: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.93160: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.93163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.93165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.93205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853271.93209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.93290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.94953: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853271.95023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853271.95140: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp84ryb510 /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/AnsiballZ_command.py <<< 11762 1726853271.95146: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/AnsiballZ_command.py" <<< 11762 1726853271.95235: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp84ryb510" to remote "/root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/AnsiballZ_command.py" <<< 11762 1726853271.95239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/AnsiballZ_command.py" <<< 11762 1726853271.95883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.95924: stderr chunk (state=3): >>><<< 11762 1726853271.95929: stdout chunk (state=3): >>><<< 11762 1726853271.95964: done transferring module to remote 11762 1726853271.95974: _low_level_execute_command(): starting 11762 1726853271.95980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/ /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/AnsiballZ_command.py && sleep 0' 11762 1726853271.96442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853271.96446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853271.96452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.96454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.96456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.96514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853271.96518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.96585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853271.98469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853271.98495: stderr chunk (state=3): >>><<< 11762 1726853271.98499: stdout chunk (state=3): >>><<< 11762 1726853271.98514: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853271.98517: _low_level_execute_command(): starting 11762 1726853271.98520: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/AnsiballZ_command.py && sleep 0' 11762 1726853271.98929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853271.98963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853271.98966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853271.98970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.98979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match found <<< 11762 1726853271.98981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853271.99026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853271.99031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853271.99033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853271.99109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853272.17306: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:27:52.147752", "end": "2024-09-20 13:27:52.170982", "delta": "0:00:00.023230", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853272.18965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853272.18989: stderr chunk (state=3): >>><<< 11762 1726853272.18992: stdout chunk (state=3): >>><<< 11762 1726853272.19012: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:27:52.147752", "end": "2024-09-20 13:27:52.170982", "delta": "0:00:00.023230", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.197 closed. 11762 1726853272.19040: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853272.19050: _low_level_execute_command(): starting 11762 1726853272.19053: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853271.8962336-12771-177249585351815/ > /dev/null 2>&1 && sleep 0' 11762 1726853272.19507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853272.19511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853272.19513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853272.19515: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853272.19565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853272.19568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853272.19576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853272.19705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853272.21600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853272.21624: stderr chunk (state=3): >>><<< 11762 1726853272.21627: stdout chunk (state=3): >>><<< 11762 1726853272.21646: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853272.21652: handler run complete 11762 1726853272.21670: Evaluated conditional (False): False 11762 1726853272.21681: attempt loop complete, returning result 11762 1726853272.21684: _execute() done 11762 1726853272.21687: dumping result to json 11762 1726853272.21691: done dumping result, returning 11762 1726853272.21698: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-d845-03d0-00000000055b] 11762 1726853272.21704: sending task result for task 02083763-bbaf-d845-03d0-00000000055b 11762 1726853272.21798: done sending task result for task 02083763-bbaf-d845-03d0-00000000055b 11762 1726853272.21800: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.023230", "end": "2024-09-20 13:27:52.170982", "rc": 0, "start": "2024-09-20 13:27:52.147752" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11762 1726853272.21870: no more pending results, returning what we have 11762 1726853272.21876: results queue empty 11762 1726853272.21876: checking for any_errors_fatal 11762 1726853272.21883: done checking for any_errors_fatal 11762 1726853272.21884: checking for max_fail_percentage 11762 1726853272.21886: done checking for max_fail_percentage 11762 1726853272.21887: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.21888: done checking to see if all hosts have failed 11762 1726853272.21888: getting the remaining hosts for this loop 11762 1726853272.21890: done getting the remaining hosts for this loop 11762 1726853272.21894: getting the next task for host managed_node2 11762 1726853272.21901: done getting next task for host managed_node2 11762 1726853272.21903: ^ task is: TASK: Set 
NM profile exist flag and ansible_managed flag true based on the nmcli output 11762 1726853272.21916: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853272.21920: getting variables 11762 1726853272.21921: in VariableManager get_vars() 11762 1726853272.21955: Calling all_inventory to load vars for managed_node2 11762 1726853272.21958: Calling groups_inventory to load vars for managed_node2 11762 1726853272.21961: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853272.21974: Calling all_plugins_play to load vars for managed_node2 11762 1726853272.21977: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853272.21979: Calling groups_plugins_play to load vars for managed_node2 11762 1726853272.22781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853272.23656: done with get_vars() 11762 1726853272.23675: done getting variables 11762 1726853272.23719: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:52 -0400 (0:00:00.386) 0:00:22.667 ****** 11762 1726853272.23748: entering _queue_task() for managed_node2/set_fact 11762 1726853272.23995: worker is 1 (out of 1 available) 11762 1726853272.24010: exiting _queue_task() for managed_node2/set_fact 11762 1726853272.24023: done queuing things up, now waiting for results queue to drain 11762 1726853272.24025: waiting for pending results... 
11762 1726853272.24209: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11762 1726853272.24293: in run() - task 02083763-bbaf-d845-03d0-00000000055c 11762 1726853272.24305: variable 'ansible_search_path' from source: unknown 11762 1726853272.24308: variable 'ansible_search_path' from source: unknown 11762 1726853272.24338: calling self._execute() 11762 1726853272.24408: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.24412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.24420: variable 'omit' from source: magic vars 11762 1726853272.24707: variable 'ansible_distribution_major_version' from source: facts 11762 1726853272.24716: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853272.24810: variable 'nm_profile_exists' from source: set_fact 11762 1726853272.24821: Evaluated conditional (nm_profile_exists.rc == 0): True 11762 1726853272.24827: variable 'omit' from source: magic vars 11762 1726853272.24866: variable 'omit' from source: magic vars 11762 1726853272.24890: variable 'omit' from source: magic vars 11762 1726853272.24925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853272.24956: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853272.24974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853272.24987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853272.24997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853272.25024: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
11762 1726853272.25027: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.25030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.25097: Set connection var ansible_timeout to 10 11762 1726853272.25100: Set connection var ansible_shell_type to sh 11762 1726853272.25105: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853272.25110: Set connection var ansible_shell_executable to /bin/sh 11762 1726853272.25118: Set connection var ansible_pipelining to False 11762 1726853272.25127: Set connection var ansible_connection to ssh 11762 1726853272.25145: variable 'ansible_shell_executable' from source: unknown 11762 1726853272.25149: variable 'ansible_connection' from source: unknown 11762 1726853272.25151: variable 'ansible_module_compression' from source: unknown 11762 1726853272.25154: variable 'ansible_shell_type' from source: unknown 11762 1726853272.25156: variable 'ansible_shell_executable' from source: unknown 11762 1726853272.25158: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.25160: variable 'ansible_pipelining' from source: unknown 11762 1726853272.25162: variable 'ansible_timeout' from source: unknown 11762 1726853272.25165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.25270: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853272.25281: variable 'omit' from source: magic vars 11762 1726853272.25286: starting attempt loop 11762 1726853272.25289: running the handler 11762 1726853272.25299: handler run complete 11762 1726853272.25307: attempt loop complete, returning result 11762 1726853272.25310: _execute() done 
11762 1726853272.25313: dumping result to json 11762 1726853272.25315: done dumping result, returning 11762 1726853272.25322: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-d845-03d0-00000000055c] 11762 1726853272.25327: sending task result for task 02083763-bbaf-d845-03d0-00000000055c 11762 1726853272.25412: done sending task result for task 02083763-bbaf-d845-03d0-00000000055c 11762 1726853272.25415: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11762 1726853272.25493: no more pending results, returning what we have 11762 1726853272.25497: results queue empty 11762 1726853272.25498: checking for any_errors_fatal 11762 1726853272.25505: done checking for any_errors_fatal 11762 1726853272.25506: checking for max_fail_percentage 11762 1726853272.25508: done checking for max_fail_percentage 11762 1726853272.25509: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.25509: done checking to see if all hosts have failed 11762 1726853272.25510: getting the remaining hosts for this loop 11762 1726853272.25512: done getting the remaining hosts for this loop 11762 1726853272.25515: getting the next task for host managed_node2 11762 1726853272.25523: done getting next task for host managed_node2 11762 1726853272.25525: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11762 1726853272.25531: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853272.25534: getting variables 11762 1726853272.25535: in VariableManager get_vars() 11762 1726853272.25563: Calling all_inventory to load vars for managed_node2 11762 1726853272.25565: Calling groups_inventory to load vars for managed_node2 11762 1726853272.25568: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853272.25578: Calling all_plugins_play to load vars for managed_node2 11762 1726853272.25581: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853272.25583: Calling groups_plugins_play to load vars for managed_node2 11762 1726853272.26447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853272.27312: done with get_vars() 11762 1726853272.27333: done getting variables 11762 1726853272.27379: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853272.27466: variable 'profile' from source: include params 11762 1726853272.27469: variable 'bond_port_profile' from source: include params 11762 1726853272.27513: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:52 -0400 (0:00:00.037) 0:00:22.705 ****** 11762 1726853272.27537: entering _queue_task() for managed_node2/command 11762 1726853272.27783: worker is 1 (out of 1 available) 11762 1726853272.27797: exiting _queue_task() for managed_node2/command 11762 1726853272.27809: done queuing things up, now waiting for results queue to drain 11762 1726853272.27810: waiting for pending 
results... 11762 1726853272.27998: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11762 1726853272.28079: in run() - task 02083763-bbaf-d845-03d0-00000000055e 11762 1726853272.28091: variable 'ansible_search_path' from source: unknown 11762 1726853272.28095: variable 'ansible_search_path' from source: unknown 11762 1726853272.28123: calling self._execute() 11762 1726853272.28193: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.28198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.28206: variable 'omit' from source: magic vars 11762 1726853272.28473: variable 'ansible_distribution_major_version' from source: facts 11762 1726853272.28484: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853272.28564: variable 'profile_stat' from source: set_fact 11762 1726853272.28574: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853272.28577: when evaluation is False, skipping this task 11762 1726853272.28580: _execute() done 11762 1726853272.28584: dumping result to json 11762 1726853272.28587: done dumping result, returning 11762 1726853272.28599: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-d845-03d0-00000000055e] 11762 1726853272.28602: sending task result for task 02083763-bbaf-d845-03d0-00000000055e 11762 1726853272.28683: done sending task result for task 02083763-bbaf-d845-03d0-00000000055e 11762 1726853272.28685: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853272.28751: no more pending results, returning what we have 11762 1726853272.28755: results queue empty 11762 1726853272.28756: checking for any_errors_fatal 11762 1726853272.28762: done checking for any_errors_fatal 11762 
1726853272.28763: checking for max_fail_percentage 11762 1726853272.28765: done checking for max_fail_percentage 11762 1726853272.28766: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.28766: done checking to see if all hosts have failed 11762 1726853272.28767: getting the remaining hosts for this loop 11762 1726853272.28769: done getting the remaining hosts for this loop 11762 1726853272.28773: getting the next task for host managed_node2 11762 1726853272.28781: done getting next task for host managed_node2 11762 1726853272.28783: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11762 1726853272.28788: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853272.28792: getting variables 11762 1726853272.28793: in VariableManager get_vars() 11762 1726853272.28828: Calling all_inventory to load vars for managed_node2 11762 1726853272.28830: Calling groups_inventory to load vars for managed_node2 11762 1726853272.28833: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853272.28845: Calling all_plugins_play to load vars for managed_node2 11762 1726853272.28848: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853272.28850: Calling groups_plugins_play to load vars for managed_node2 11762 1726853272.29624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853272.30587: done with get_vars() 11762 1726853272.30602: done getting variables 11762 1726853272.30648: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853272.30729: variable 'profile' from source: include params 11762 1726853272.30731: variable 'bond_port_profile' from source: include params 11762 1726853272.30776: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:52 -0400 (0:00:00.032) 0:00:22.738 ****** 11762 1726853272.30800: entering _queue_task() for managed_node2/set_fact 11762 1726853272.31039: worker is 1 (out of 1 available) 11762 1726853272.31056: exiting _queue_task() for managed_node2/set_fact 11762 1726853272.31070: 
done queuing things up, now waiting for results queue to drain 11762 1726853272.31074: waiting for pending results... 11762 1726853272.31251: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11762 1726853272.31335: in run() - task 02083763-bbaf-d845-03d0-00000000055f 11762 1726853272.31349: variable 'ansible_search_path' from source: unknown 11762 1726853272.31352: variable 'ansible_search_path' from source: unknown 11762 1726853272.31380: calling self._execute() 11762 1726853272.31449: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.31453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.31461: variable 'omit' from source: magic vars 11762 1726853272.31721: variable 'ansible_distribution_major_version' from source: facts 11762 1726853272.31735: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853272.31885: variable 'profile_stat' from source: set_fact 11762 1726853272.31903: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853272.31906: when evaluation is False, skipping this task 11762 1726853272.31909: _execute() done 11762 1726853272.31912: dumping result to json 11762 1726853272.31914: done dumping result, returning 11762 1726853272.31917: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-d845-03d0-00000000055f] 11762 1726853272.31919: sending task result for task 02083763-bbaf-d845-03d0-00000000055f 11762 1726853272.32006: done sending task result for task 02083763-bbaf-d845-03d0-00000000055f 11762 1726853272.32009: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853272.32056: no more pending results, returning what we have 11762 1726853272.32061: results queue empty 11762 
1726853272.32061: checking for any_errors_fatal 11762 1726853272.32068: done checking for any_errors_fatal 11762 1726853272.32068: checking for max_fail_percentage 11762 1726853272.32072: done checking for max_fail_percentage 11762 1726853272.32073: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.32074: done checking to see if all hosts have failed 11762 1726853272.32074: getting the remaining hosts for this loop 11762 1726853272.32076: done getting the remaining hosts for this loop 11762 1726853272.32079: getting the next task for host managed_node2 11762 1726853272.32087: done getting next task for host managed_node2 11762 1726853272.32092: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11762 1726853272.32096: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853272.32100: getting variables 11762 1726853272.32101: in VariableManager get_vars() 11762 1726853272.32129: Calling all_inventory to load vars for managed_node2 11762 1726853272.32131: Calling groups_inventory to load vars for managed_node2 11762 1726853272.32133: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853272.32145: Calling all_plugins_play to load vars for managed_node2 11762 1726853272.32148: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853272.32151: Calling groups_plugins_play to load vars for managed_node2 11762 1726853272.33532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853272.34764: done with get_vars() 11762 1726853272.34785: done getting variables 11762 1726853272.34832: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853272.34945: variable 'profile' from source: include params 11762 1726853272.34949: variable 'bond_port_profile' from source: include params 11762 1726853272.35064: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:52 -0400 (0:00:00.042) 0:00:22.781 ****** 11762 1726853272.35101: entering _queue_task() for managed_node2/command 11762 1726853272.35409: worker 
is 1 (out of 1 available) 11762 1726853272.35424: exiting _queue_task() for managed_node2/command 11762 1726853272.35439: done queuing things up, now waiting for results queue to drain 11762 1726853272.35441: waiting for pending results... 11762 1726853272.35621: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 11762 1726853272.35711: in run() - task 02083763-bbaf-d845-03d0-000000000560 11762 1726853272.35724: variable 'ansible_search_path' from source: unknown 11762 1726853272.35727: variable 'ansible_search_path' from source: unknown 11762 1726853272.35756: calling self._execute() 11762 1726853272.35868: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.35874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.35877: variable 'omit' from source: magic vars 11762 1726853272.36582: variable 'ansible_distribution_major_version' from source: facts 11762 1726853272.36586: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853272.36608: variable 'profile_stat' from source: set_fact 11762 1726853272.36618: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853272.36622: when evaluation is False, skipping this task 11762 1726853272.36625: _execute() done 11762 1726853272.36628: dumping result to json 11762 1726853272.36630: done dumping result, returning 11762 1726853272.36639: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-d845-03d0-000000000560] 11762 1726853272.36647: sending task result for task 02083763-bbaf-d845-03d0-000000000560 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853272.36817: no more pending results, returning what we have 11762 1726853272.36822: results queue empty 11762 1726853272.36823: checking for any_errors_fatal 
11762 1726853272.36830: done checking for any_errors_fatal 11762 1726853272.36831: checking for max_fail_percentage 11762 1726853272.36833: done checking for max_fail_percentage 11762 1726853272.36834: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.36835: done checking to see if all hosts have failed 11762 1726853272.36836: getting the remaining hosts for this loop 11762 1726853272.36838: done getting the remaining hosts for this loop 11762 1726853272.36841: getting the next task for host managed_node2 11762 1726853272.36852: done getting next task for host managed_node2 11762 1726853272.36855: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11762 1726853272.36861: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853272.36865: getting variables 11762 1726853272.36866: in VariableManager get_vars() 11762 1726853272.36903: Calling all_inventory to load vars for managed_node2 11762 1726853272.36906: Calling groups_inventory to load vars for managed_node2 11762 1726853272.36909: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853272.36921: Calling all_plugins_play to load vars for managed_node2 11762 1726853272.36924: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853272.36926: Calling groups_plugins_play to load vars for managed_node2 11762 1726853272.37485: done sending task result for task 02083763-bbaf-d845-03d0-000000000560 11762 1726853272.37488: WORKER PROCESS EXITING 11762 1726853272.38631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853272.39893: done with get_vars() 11762 1726853272.39917: done getting variables 11762 1726853272.39980: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853272.40096: variable 'profile' from source: include params 11762 1726853272.40101: variable 'bond_port_profile' from source: include params 11762 1726853272.40162: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:52 -0400 (0:00:00.050) 0:00:22.832 ****** 11762 
1726853272.40196: entering _queue_task() for managed_node2/set_fact 11762 1726853272.40528: worker is 1 (out of 1 available) 11762 1726853272.40546: exiting _queue_task() for managed_node2/set_fact 11762 1726853272.40558: done queuing things up, now waiting for results queue to drain 11762 1726853272.40560: waiting for pending results... 11762 1726853272.40859: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11762 1726853272.41480: in run() - task 02083763-bbaf-d845-03d0-000000000561 11762 1726853272.41484: variable 'ansible_search_path' from source: unknown 11762 1726853272.41487: variable 'ansible_search_path' from source: unknown 11762 1726853272.41490: calling self._execute() 11762 1726853272.41492: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.41494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.41496: variable 'omit' from source: magic vars 11762 1726853272.42266: variable 'ansible_distribution_major_version' from source: facts 11762 1726853272.42286: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853272.42415: variable 'profile_stat' from source: set_fact 11762 1726853272.42539: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853272.42545: when evaluation is False, skipping this task 11762 1726853272.42552: _execute() done 11762 1726853272.42556: dumping result to json 11762 1726853272.42567: done dumping result, returning 11762 1726853272.42579: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-d845-03d0-000000000561] 11762 1726853272.42584: sending task result for task 02083763-bbaf-d845-03d0-000000000561 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853272.42746: no more pending results, returning what 
we have 11762 1726853272.42751: results queue empty 11762 1726853272.42752: checking for any_errors_fatal 11762 1726853272.42758: done checking for any_errors_fatal 11762 1726853272.42758: checking for max_fail_percentage 11762 1726853272.42760: done checking for max_fail_percentage 11762 1726853272.42761: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.42762: done checking to see if all hosts have failed 11762 1726853272.42763: getting the remaining hosts for this loop 11762 1726853272.42765: done getting the remaining hosts for this loop 11762 1726853272.42769: getting the next task for host managed_node2 11762 1726853272.42781: done getting next task for host managed_node2 11762 1726853272.42784: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11762 1726853272.42789: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853272.42794: getting variables 11762 1726853272.42795: in VariableManager get_vars() 11762 1726853272.42827: Calling all_inventory to load vars for managed_node2 11762 1726853272.42830: Calling groups_inventory to load vars for managed_node2 11762 1726853272.42833: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853272.42847: Calling all_plugins_play to load vars for managed_node2 11762 1726853272.42850: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853272.42853: Calling groups_plugins_play to load vars for managed_node2 11762 1726853272.43685: done sending task result for task 02083763-bbaf-d845-03d0-000000000561 11762 1726853272.43689: WORKER PROCESS EXITING 11762 1726853272.45396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853272.50324: done with get_vars() 11762 1726853272.50358: done getting variables 11762 1726853272.50530: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853272.50797: variable 'profile' from source: include params 11762 1726853272.50801: variable 'bond_port_profile' from source: include params 11762 1726853272.50973: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:52 -0400 (0:00:00.109) 0:00:22.941 ****** 11762 1726853272.51122: entering _queue_task() for managed_node2/assert 11762 1726853272.52033: worker is 1 (out of 1 available) 11762 1726853272.52048: exiting _queue_task() for 
managed_node2/assert 11762 1726853272.52060: done queuing things up, now waiting for results queue to drain 11762 1726853272.52062: waiting for pending results... 11762 1726853272.53289: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' 11762 1726853272.53296: in run() - task 02083763-bbaf-d845-03d0-0000000004e1 11762 1726853272.53300: variable 'ansible_search_path' from source: unknown 11762 1726853272.53303: variable 'ansible_search_path' from source: unknown 11762 1726853272.53516: calling self._execute() 11762 1726853272.53608: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.53685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.53959: variable 'omit' from source: magic vars 11762 1726853272.54676: variable 'ansible_distribution_major_version' from source: facts 11762 1726853272.54695: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853272.54707: variable 'omit' from source: magic vars 11762 1726853272.54769: variable 'omit' from source: magic vars 11762 1726853272.54984: variable 'profile' from source: include params 11762 1726853272.55083: variable 'bond_port_profile' from source: include params 11762 1726853272.55150: variable 'bond_port_profile' from source: include params 11762 1726853272.55279: variable 'omit' from source: magic vars 11762 1726853272.55325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853272.55368: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853272.55437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853272.55513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853272.55589: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853272.55625: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853272.55649: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.55695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.55820: Set connection var ansible_timeout to 10 11762 1726853272.55858: Set connection var ansible_shell_type to sh 11762 1726853272.55869: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853272.55882: Set connection var ansible_shell_executable to /bin/sh 11762 1726853272.55893: Set connection var ansible_pipelining to False 11762 1726853272.55906: Set connection var ansible_connection to ssh 11762 1726853272.55937: variable 'ansible_shell_executable' from source: unknown 11762 1726853272.55948: variable 'ansible_connection' from source: unknown 11762 1726853272.55955: variable 'ansible_module_compression' from source: unknown 11762 1726853272.55964: variable 'ansible_shell_type' from source: unknown 11762 1726853272.55979: variable 'ansible_shell_executable' from source: unknown 11762 1726853272.55987: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.56076: variable 'ansible_pipelining' from source: unknown 11762 1726853272.56079: variable 'ansible_timeout' from source: unknown 11762 1726853272.56081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.56155: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853272.56174: variable 'omit' from source: magic vars 11762 1726853272.56185: starting 
attempt loop 11762 1726853272.56192: running the handler 11762 1726853272.56311: variable 'lsr_net_profile_exists' from source: set_fact 11762 1726853272.56321: Evaluated conditional (lsr_net_profile_exists): True 11762 1726853272.56331: handler run complete 11762 1726853272.56352: attempt loop complete, returning result 11762 1726853272.56358: _execute() done 11762 1726853272.56364: dumping result to json 11762 1726853272.56372: done dumping result, returning 11762 1726853272.56384: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' [02083763-bbaf-d845-03d0-0000000004e1] 11762 1726853272.56393: sending task result for task 02083763-bbaf-d845-03d0-0000000004e1 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853272.56540: no more pending results, returning what we have 11762 1726853272.56544: results queue empty 11762 1726853272.56545: checking for any_errors_fatal 11762 1726853272.56550: done checking for any_errors_fatal 11762 1726853272.56551: checking for max_fail_percentage 11762 1726853272.56553: done checking for max_fail_percentage 11762 1726853272.56554: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.56554: done checking to see if all hosts have failed 11762 1726853272.56555: getting the remaining hosts for this loop 11762 1726853272.56557: done getting the remaining hosts for this loop 11762 1726853272.56560: getting the next task for host managed_node2 11762 1726853272.56566: done getting next task for host managed_node2 11762 1726853272.56568: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11762 1726853272.56575: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853272.56578: getting variables 11762 1726853272.56580: in VariableManager get_vars() 11762 1726853272.56613: Calling all_inventory to load vars for managed_node2 11762 1726853272.56615: Calling groups_inventory to load vars for managed_node2 11762 1726853272.56618: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853272.56631: Calling all_plugins_play to load vars for managed_node2 11762 1726853272.56634: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853272.56637: Calling groups_plugins_play to load vars for managed_node2 11762 1726853272.57157: done sending task result for task 02083763-bbaf-d845-03d0-0000000004e1 11762 1726853272.57160: WORKER PROCESS EXITING 11762 1726853272.58997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853272.62300: done with get_vars() 11762 1726853272.62333: done getting variables 11762 1726853272.62392: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853272.63016: variable 'profile' from source: include params 11762 1726853272.63021: variable 'bond_port_profile' from source: include params 11762 1726853272.63088: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:52 -0400 (0:00:00.119) 0:00:23.061 ****** 11762 1726853272.63123: entering _queue_task() for managed_node2/assert 11762 1726853272.64304: worker is 1 (out of 1 available) 11762 1726853272.64317: exiting _queue_task() for managed_node2/assert 11762 1726853272.64333: done queuing things up, now waiting for results queue to drain 11762 1726853272.64334: waiting for pending results... 
11762 1726853272.65193: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11762 1726853272.65334: in run() - task 02083763-bbaf-d845-03d0-0000000004e2 11762 1726853272.65339: variable 'ansible_search_path' from source: unknown 11762 1726853272.65343: variable 'ansible_search_path' from source: unknown 11762 1726853272.65531: calling self._execute() 11762 1726853272.65670: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.65677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.65687: variable 'omit' from source: magic vars 11762 1726853272.66742: variable 'ansible_distribution_major_version' from source: facts 11762 1726853272.66746: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853272.66748: variable 'omit' from source: magic vars 11762 1726853272.66846: variable 'omit' from source: magic vars 11762 1726853272.67288: variable 'profile' from source: include params 11762 1726853272.67292: variable 'bond_port_profile' from source: include params 11762 1726853272.67294: variable 'bond_port_profile' from source: include params 11762 1726853272.67397: variable 'omit' from source: magic vars 11762 1726853272.67539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853272.67774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853272.68074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853272.68078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853272.68080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853272.68083: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853272.68086: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.68089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.68273: Set connection var ansible_timeout to 10 11762 1726853272.68410: Set connection var ansible_shell_type to sh 11762 1726853272.68418: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853272.68424: Set connection var ansible_shell_executable to /bin/sh 11762 1726853272.68433: Set connection var ansible_pipelining to False 11762 1726853272.68440: Set connection var ansible_connection to ssh 11762 1726853272.68505: variable 'ansible_shell_executable' from source: unknown 11762 1726853272.68508: variable 'ansible_connection' from source: unknown 11762 1726853272.68682: variable 'ansible_module_compression' from source: unknown 11762 1726853272.68686: variable 'ansible_shell_type' from source: unknown 11762 1726853272.68688: variable 'ansible_shell_executable' from source: unknown 11762 1726853272.68691: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853272.68696: variable 'ansible_pipelining' from source: unknown 11762 1726853272.68699: variable 'ansible_timeout' from source: unknown 11762 1726853272.68703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853272.69050: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853272.69054: variable 'omit' from source: magic vars 11762 1726853272.69056: starting attempt loop 11762 1726853272.69059: running the handler 11762 1726853272.70196: variable 'lsr_net_profile_ansible_managed' from source: set_fact 
11762 1726853272.70200: Evaluated conditional (lsr_net_profile_ansible_managed): True 11762 1726853272.70202: handler run complete 11762 1726853272.70205: attempt loop complete, returning result 11762 1726853272.70207: _execute() done 11762 1726853272.70209: dumping result to json 11762 1726853272.70211: done dumping result, returning 11762 1726853272.70214: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [02083763-bbaf-d845-03d0-0000000004e2] 11762 1726853272.70291: sending task result for task 02083763-bbaf-d845-03d0-0000000004e2 11762 1726853272.70386: done sending task result for task 02083763-bbaf-d845-03d0-0000000004e2 11762 1726853272.70573: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853272.70633: no more pending results, returning what we have 11762 1726853272.70638: results queue empty 11762 1726853272.70639: checking for any_errors_fatal 11762 1726853272.70648: done checking for any_errors_fatal 11762 1726853272.70649: checking for max_fail_percentage 11762 1726853272.70651: done checking for max_fail_percentage 11762 1726853272.70653: checking to see if all hosts have failed and the running result is not ok 11762 1726853272.70654: done checking to see if all hosts have failed 11762 1726853272.70654: getting the remaining hosts for this loop 11762 1726853272.70656: done getting the remaining hosts for this loop 11762 1726853272.70660: getting the next task for host managed_node2 11762 1726853272.70669: done getting next task for host managed_node2 11762 1726853272.70675: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11762 1726853272.70680: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853272.70685: getting variables
11762 1726853272.70686: in VariableManager get_vars()
11762 1726853272.70721: Calling all_inventory to load vars for managed_node2
11762 1726853272.70724: Calling groups_inventory to load vars for managed_node2
11762 1726853272.70728: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853272.70741: Calling all_plugins_play to load vars for managed_node2
11762 1726853272.70744: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853272.70747: Calling groups_plugins_play to load vars for managed_node2
11762 1726853272.72920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853272.74935: done with get_vars()
11762 1726853272.74961: done getting variables
11762 1726853272.75027: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths:
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11762 1726853272.75257: variable 'profile' from source: include params
11762 1726853272.75262: variable 'bond_port_profile' from source: include params
11762 1726853272.75325: variable 'bond_port_profile' from source: include params

TASK [Assert that the fingerprint comment is present in bond0.0] ***************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Friday 20 September 2024 13:27:52 -0400 (0:00:00.122) 0:00:23.184 ******
11762 1726853272.75360: entering _queue_task() for managed_node2/assert
11762 1726853272.76057: worker is 1 (out of 1 available)
11762 1726853272.76074: exiting _queue_task() for managed_node2/assert
11762 1726853272.76087: done queuing things up, now waiting for results queue to drain
11762 1726853272.76089: waiting for pending results...
11762 1726853272.76753: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0
11762 1726853272.77179: in run() - task 02083763-bbaf-d845-03d0-0000000004e3
11762 1726853272.77184: variable 'ansible_search_path' from source: unknown
11762 1726853272.77186: variable 'ansible_search_path' from source: unknown
11762 1726853272.77190: calling self._execute()
11762 1726853272.77414: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853272.77423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853272.77441: variable 'omit' from source: magic vars
11762 1726853272.77896: variable 'ansible_distribution_major_version' from source: facts
11762 1726853272.77960: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853272.77964: variable 'omit' from source: magic vars
11762 1726853272.77991: variable 'omit' from source: magic vars
11762 1726853272.78102: variable 'profile' from source: include params
11762 1726853272.78114: variable 'bond_port_profile' from source: include params
11762 1726853272.78183: variable 'bond_port_profile' from source: include params
11762 1726853272.78208: variable 'omit' from source: magic vars
11762 1726853272.78255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853272.78395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853272.78398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853272.78401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853272.78403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853272.78406: variable
'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853272.78407: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853272.78410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853272.78521: Set connection var ansible_timeout to 10
11762 1726853272.78530: Set connection var ansible_shell_type to sh
11762 1726853272.78541: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853272.78552: Set connection var ansible_shell_executable to /bin/sh
11762 1726853272.78564: Set connection var ansible_pipelining to False
11762 1726853272.78579: Set connection var ansible_connection to ssh
11762 1726853272.78609: variable 'ansible_shell_executable' from source: unknown
11762 1726853272.78621: variable 'ansible_connection' from source: unknown
11762 1726853272.78628: variable 'ansible_module_compression' from source: unknown
11762 1726853272.78635: variable 'ansible_shell_type' from source: unknown
11762 1726853272.78641: variable 'ansible_shell_executable' from source: unknown
11762 1726853272.78647: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853272.78655: variable 'ansible_pipelining' from source: unknown
11762 1726853272.78661: variable 'ansible_timeout' from source: unknown
11762 1726853272.78668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853272.78811: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853272.78976: variable 'omit' from source: magic vars
11762 1726853272.78979: starting attempt loop
11762 1726853272.78982: running the handler
11762 1726853272.78984: variable 'lsr_net_profile_fingerprint' from source: set_fact
11762 1726853272.78986: Evaluated conditional (lsr_net_profile_fingerprint): True
11762 1726853272.78988: handler run complete
11762 1726853272.78991: attempt loop complete, returning result
11762 1726853272.79000: _execute() done
11762 1726853272.79007: dumping result to json
11762 1726853272.79015: done dumping result, returning
11762 1726853272.79027: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 [02083763-bbaf-d845-03d0-0000000004e3]
11762 1726853272.79036: sending task result for task 02083763-bbaf-d845-03d0-0000000004e3
11762 1726853272.79311: done sending task result for task 02083763-bbaf-d845-03d0-0000000004e3
11762 1726853272.79314: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed

11762 1726853272.79363: no more pending results, returning what we have
11762 1726853272.79367: results queue empty
11762 1726853272.79368: checking for any_errors_fatal
11762 1726853272.79376: done checking for any_errors_fatal
11762 1726853272.79377: checking for max_fail_percentage
11762 1726853272.79378: done checking for max_fail_percentage
11762 1726853272.79380: checking to see if all hosts have failed and the running result is not ok
11762 1726853272.79380: done checking to see if all hosts have failed
11762 1726853272.79381: getting the remaining hosts for this loop
11762 1726853272.79383: done getting the remaining hosts for this loop
11762 1726853272.79386: getting the next task for host managed_node2
11762 1726853272.79398: done getting next task for host managed_node2
11762 1726853272.79401: ^ task is: TASK: Include the task 'get_profile_stat.yml'
11762 1726853272.79406: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853272.79411: getting variables
11762 1726853272.79412: in VariableManager get_vars()
11762 1726853272.79445: Calling all_inventory to load vars for managed_node2
11762 1726853272.79448: Calling groups_inventory to load vars for managed_node2
11762 1726853272.79452: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853272.79463: Calling all_plugins_play to load vars for managed_node2
11762 1726853272.79466: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853272.79468: Calling groups_plugins_play to load vars for managed_node2
11762 1726853272.81335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853272.82954: done with get_vars()
11762 1726853272.82990: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20
September 2024 13:27:52 -0400 (0:00:00.077) 0:00:23.261 ******
11762 1726853272.83104: entering _queue_task() for managed_node2/include_tasks
11762 1726853272.84009: worker is 1 (out of 1 available)
11762 1726853272.84020: exiting _queue_task() for managed_node2/include_tasks
11762 1726853272.84032: done queuing things up, now waiting for results queue to drain
11762 1726853272.84034: waiting for pending results...
11762 1726853272.84494: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml'
11762 1726853272.84669: in run() - task 02083763-bbaf-d845-03d0-0000000004e7
11762 1726853272.84715: variable 'ansible_search_path' from source: unknown
11762 1726853272.84805: variable 'ansible_search_path' from source: unknown
11762 1726853272.84829: calling self._execute()
11762 1726853272.85132: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853272.85136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853272.85140: variable 'omit' from source: magic vars
11762 1726853272.85813: variable 'ansible_distribution_major_version' from source: facts
11762 1726853272.85834: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853272.85848: _execute() done
11762 1726853272.85857: dumping result to json
11762 1726853272.85865: done dumping result, returning
11762 1726853272.85877: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-d845-03d0-0000000004e7]
11762 1726853272.85887: sending task result for task 02083763-bbaf-d845-03d0-0000000004e7
11762 1726853272.86181: done sending task result for task 02083763-bbaf-d845-03d0-0000000004e7
11762 1726853272.86185: WORKER PROCESS EXITING
11762 1726853272.86215: no more pending results, returning what we have
11762 1726853272.86222: in VariableManager get_vars()
11762 1726853272.86269: Calling all_inventory to load vars for managed_node2
11762
1726853272.86275: Calling groups_inventory to load vars for managed_node2
11762 1726853272.86279: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853272.86294: Calling all_plugins_play to load vars for managed_node2
11762 1726853272.86298: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853272.86302: Calling groups_plugins_play to load vars for managed_node2
11762 1726853272.88508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853272.90554: done with get_vars()
11762 1726853272.90583: variable 'ansible_search_path' from source: unknown
11762 1726853272.90585: variable 'ansible_search_path' from source: unknown
11762 1726853272.90623: we have included files to process
11762 1726853272.90625: generating all_blocks data
11762 1726853272.90627: done generating all_blocks data
11762 1726853272.90631: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11762 1726853272.90632: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11762 1726853272.90634: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11762 1726853272.91561: done processing included file
11762 1726853272.91564: iterating over new_blocks loaded from include file
11762 1726853272.91566: in VariableManager get_vars()
11762 1726853272.91585: done with get_vars()
11762 1726853272.91587: filtering new block on tags
11762 1726853272.91703: done filtering new block on tags
11762 1726853272.91707: in VariableManager get_vars()
11762 1726853272.91723: done with get_vars()
11762 1726853272.91724: filtering new block on tags
11762 1726853272.91813: done filtering new block on tags
11762 1726853272.91816: done iterating over new_blocks
loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2
11762 1726853272.91822: extending task lists for all hosts with included blocks
11762 1726853272.92432: done extending task lists
11762 1726853272.92434: done processing included files
11762 1726853272.92434: results queue empty
11762 1726853272.92435: checking for any_errors_fatal
11762 1726853272.92457: done checking for any_errors_fatal
11762 1726853272.92458: checking for max_fail_percentage
11762 1726853272.92459: done checking for max_fail_percentage
11762 1726853272.92460: checking to see if all hosts have failed and the running result is not ok
11762 1726853272.92460: done checking to see if all hosts have failed
11762 1726853272.92461: getting the remaining hosts for this loop
11762 1726853272.92462: done getting the remaining hosts for this loop
11762 1726853272.92483: getting the next task for host managed_node2
11762 1726853272.92489: done getting next task for host managed_node2
11762 1726853272.92492: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
11762 1726853272.92495: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853272.92498: getting variables
11762 1726853272.92499: in VariableManager get_vars()
11762 1726853272.92508: Calling all_inventory to load vars for managed_node2
11762 1726853272.92510: Calling groups_inventory to load vars for managed_node2
11762 1726853272.92512: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853272.92554: Calling all_plugins_play to load vars for managed_node2
11762 1726853272.92557: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853272.92561: Calling groups_plugins_play to load vars for managed_node2
11762 1726853272.94395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853272.97302: done with get_vars()
11762 1726853272.97336: done getting variables
11762 1726853272.97387: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path:
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 13:27:52 -0400 (0:00:00.143) 0:00:23.404 ******
11762 1726853272.97423: entering _queue_task() for managed_node2/set_fact
11762 1726853272.97896: worker is 1 (out of 1 available)
11762 1726853272.97907: exiting _queue_task() for managed_node2/set_fact
11762 1726853272.97918: done queuing things up, now waiting for results queue to drain
11762 1726853272.97919: waiting for pending results...
11762 1726853272.98108: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag
11762 1726853272.98247: in run() - task 02083763-bbaf-d845-03d0-0000000005b4
11762 1726853272.98312: variable 'ansible_search_path' from source: unknown
11762 1726853272.98316: variable 'ansible_search_path' from source: unknown
11762 1726853272.98318: calling self._execute()
11762 1726853272.98417: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853272.98430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853272.98443: variable 'omit' from source: magic vars
11762 1726853272.98845: variable 'ansible_distribution_major_version' from source: facts
11762 1726853272.98867: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853272.98902: variable 'omit' from source: magic vars
11762 1726853272.98952: variable 'omit' from source: magic vars
11762 1726853272.98997: variable 'omit' from source: magic vars
11762 1726853272.99049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853272.99178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853272.99181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853272.99183: Loading ShellModule 'sh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853272.99185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853272.99188: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853272.99198: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853272.99205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853272.99314: Set connection var ansible_timeout to 10
11762 1726853272.99322: Set connection var ansible_shell_type to sh
11762 1726853272.99332: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853272.99341: Set connection var ansible_shell_executable to /bin/sh
11762 1726853272.99352: Set connection var ansible_pipelining to False
11762 1726853272.99362: Set connection var ansible_connection to ssh
11762 1726853272.99396: variable 'ansible_shell_executable' from source: unknown
11762 1726853272.99504: variable 'ansible_connection' from source: unknown
11762 1726853272.99507: variable 'ansible_module_compression' from source: unknown
11762 1726853272.99509: variable 'ansible_shell_type' from source: unknown
11762 1726853272.99513: variable 'ansible_shell_executable' from source: unknown
11762 1726853272.99515: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853272.99517: variable 'ansible_pipelining' from source: unknown
11762 1726853272.99518: variable 'ansible_timeout' from source: unknown
11762 1726853272.99520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853272.99593: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths:
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853272.99616: variable 'omit' from source: magic vars
11762 1726853272.99626: starting attempt loop
11762 1726853272.99640: running the handler
11762 1726853272.99657: handler run complete
11762 1726853272.99670: attempt loop complete, returning result
11762 1726853272.99678: _execute() done
11762 1726853272.99684: dumping result to json
11762 1726853272.99722: done dumping result, returning
11762 1726853272.99729: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-d845-03d0-0000000005b4]
11762 1726853272.99731: sending task result for task 02083763-bbaf-d845-03d0-0000000005b4
11762 1726853273.00052: done sending task result for task 02083763-bbaf-d845-03d0-0000000005b4
11762 1726853273.00057: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
11762 1726853273.00115: no more pending results, returning what we have
11762 1726853273.00120: results queue empty
11762 1726853273.00120: checking for any_errors_fatal
11762 1726853273.00122: done checking for any_errors_fatal
11762 1726853273.00123: checking for max_fail_percentage
11762 1726853273.00124: done checking for max_fail_percentage
11762 1726853273.00125: checking to see if all hosts have failed and the running result is not ok
11762 1726853273.00126: done checking to see if all hosts have failed
11762 1726853273.00127: getting the remaining hosts for this loop
11762 1726853273.00129: done getting the remaining hosts for this loop
11762 1726853273.00132: getting the next task for host managed_node2
11762 1726853273.00141: done getting next task for host managed_node2
11762 1726853273.00143: ^ task is:
TASK: Stat profile file
11762 1726853273.00149: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
11762 1726853273.00167: getting variables
11762 1726853273.00169: in VariableManager get_vars()
11762 1726853273.00209: Calling all_inventory to load vars for managed_node2
11762 1726853273.00212: Calling groups_inventory to load vars for managed_node2
11762 1726853273.00216: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853273.00332: Calling all_plugins_play to load vars for managed_node2
11762 1726853273.00337: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853273.00341: Calling groups_plugins_play to load vars for managed_node2
11762 1726853273.02919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853273.05451: done with get_vars()
11762 1726853273.05533: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 13:27:53 -0400 (0:00:00.083) 0:00:23.487 ******
11762 1726853273.05761: entering _queue_task() for managed_node2/stat
11762 1726853273.06285: worker is 1 (out of 1 available)
11762 1726853273.06305: exiting _queue_task() for managed_node2/stat
11762 1726853273.06318: done queuing things up, now waiting for results queue to drain
11762 1726853273.06320: waiting for pending results...
11762 1726853273.06600: running TaskExecutor() for managed_node2/TASK: Stat profile file
11762 1726853273.06685: in run() - task 02083763-bbaf-d845-03d0-0000000005b5
11762 1726853273.06713: variable 'ansible_search_path' from source: unknown
11762 1726853273.06721: variable 'ansible_search_path' from source: unknown
11762 1726853273.06763: calling self._execute()
11762 1726853273.06884: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853273.06910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853273.06913: variable 'omit' from source: magic vars
11762 1726853273.07344: variable 'ansible_distribution_major_version' from source: facts
11762 1726853273.07347: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853273.07349: variable 'omit' from source: magic vars
11762 1726853273.07439: variable 'omit' from source: magic vars
11762 1726853273.07581: variable 'profile' from source: include params
11762 1726853273.07670: variable 'bond_port_profile' from source: include params
11762 1726853273.07675: variable 'bond_port_profile' from source: include params
11762 1726853273.07690: variable 'omit' from source: magic vars
11762 1726853273.07739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853273.07784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853273.07807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853273.07833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853273.07849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853273.07888: variable 'inventory_hostname' from source: host vars for
'managed_node2'
11762 1726853273.07913: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853273.07924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853273.08034: Set connection var ansible_timeout to 10
11762 1726853273.08075: Set connection var ansible_shell_type to sh
11762 1726853273.08078: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853273.08080: Set connection var ansible_shell_executable to /bin/sh
11762 1726853273.08082: Set connection var ansible_pipelining to False
11762 1726853273.08084: Set connection var ansible_connection to ssh
11762 1726853273.08111: variable 'ansible_shell_executable' from source: unknown
11762 1726853273.08120: variable 'ansible_connection' from source: unknown
11762 1726853273.08126: variable 'ansible_module_compression' from source: unknown
11762 1726853273.08132: variable 'ansible_shell_type' from source: unknown
11762 1726853273.08137: variable 'ansible_shell_executable' from source: unknown
11762 1726853273.08212: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853273.08215: variable 'ansible_pipelining' from source: unknown
11762 1726853273.08220: variable 'ansible_timeout' from source: unknown
11762 1726853273.08222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853273.08388: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
11762 1726853273.08414: variable 'omit' from source: magic vars
11762 1726853273.08428: starting attempt loop
11762 1726853273.08435: running the handler
11762 1726853273.08454: _low_level_execute_command(): starting
11762 1726853273.08466: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11762
1726853273.09255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11762 1726853273.09280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11762 1726853273.09376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11762 1726853273.09393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11762 1726853273.09424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
11762 1726853273.09444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11762 1726853273.09642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11762 1726853273.09719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11762 1726853273.11477: stdout chunk (state=3): >>>/root <<<
11762 1726853273.11629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11762 1726853273.11633: stdout chunk (state=3): >>><<<
11762 1726853273.11635: stderr chunk (state=3): >>><<<
11762 1726853273.11663: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853273.11765: _low_level_execute_command(): starting 11762 1726853273.11769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859 `" && echo ansible-tmp-1726853273.116697-12824-142902415376859="` echo /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859 `" ) && sleep 0' 11762 1726853273.12288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853273.12303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853273.12320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853273.12395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853273.12460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853273.12490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.12513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.12612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.14982: stdout chunk (state=3): >>>ansible-tmp-1726853273.116697-12824-142902415376859=/root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859 <<< 11762 1726853273.15050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.15065: stdout chunk (state=3): >>><<< 11762 1726853273.15081: stderr chunk (state=3): >>><<< 11762 1726853273.15104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853273.116697-12824-142902415376859=/root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853273.15154: variable 'ansible_module_compression' from source: unknown 11762 1726853273.15225: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853273.15276: variable 'ansible_facts' from source: unknown 11762 1726853273.15365: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/AnsiballZ_stat.py 11762 1726853273.15682: Sending initial data 11762 1726853273.15685: Sent initial data (152 bytes) 11762 1726853273.16129: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853273.16143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853273.16159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853273.16264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.16294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.16399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.18144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853273.18227: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853273.18581: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp4lb5jwyx /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/AnsiballZ_stat.py <<< 11762 1726853273.18585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/AnsiballZ_stat.py" <<< 11762 1726853273.18656: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp4lb5jwyx" to remote "/root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/AnsiballZ_stat.py" <<< 11762 1726853273.19519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.19590: stderr chunk (state=3): >>><<< 11762 1726853273.19602: stdout chunk (state=3): >>><<< 11762 1726853273.19655: done transferring module to remote 11762 1726853273.19673: _low_level_execute_command(): starting 11762 1726853273.19683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/ /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/AnsiballZ_stat.py && sleep 0' 11762 1726853273.20384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.20456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.20552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.22562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.22566: stdout chunk (state=3): >>><<< 11762 1726853273.22568: stderr chunk (state=3): >>><<< 11762 1726853273.22731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853273.22737: _low_level_execute_command(): starting 11762 1726853273.22741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/AnsiballZ_stat.py && sleep 0' 11762 1726853273.23576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853273.23580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853273.23582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853273.23594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853273.23622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853273.23636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.23667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.23783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.40079: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853273.41584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853273.41805: stdout chunk (state=3): >>><<< 11762 1726853273.41808: stderr chunk (state=3): >>><<< 11762 1726853273.41811: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853273.41814: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853273.41816: _low_level_execute_command(): starting 11762 1726853273.41819: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853273.116697-12824-142902415376859/ > /dev/null 2>&1 && sleep 0' 11762 1726853273.43162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853273.43179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853273.43285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.43409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.43439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.45376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.45434: stderr chunk (state=3): >>><<< 11762 1726853273.45437: stdout chunk (state=3): >>><<< 11762 1726853273.45473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 11762 1726853273.45480: handler run complete 11762 1726853273.45503: attempt loop complete, returning result 11762 1726853273.45506: _execute() done 11762 1726853273.45508: dumping result to json 11762 1726853273.45510: done dumping result, returning 11762 1726853273.45520: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-d845-03d0-0000000005b5] 11762 1726853273.45525: sending task result for task 02083763-bbaf-d845-03d0-0000000005b5 11762 1726853273.45623: done sending task result for task 02083763-bbaf-d845-03d0-0000000005b5 11762 1726853273.45627: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11762 1726853273.45686: no more pending results, returning what we have 11762 1726853273.45690: results queue empty 11762 1726853273.45691: checking for any_errors_fatal 11762 1726853273.45700: done checking for any_errors_fatal 11762 1726853273.45700: checking for max_fail_percentage 11762 1726853273.45702: done checking for max_fail_percentage 11762 1726853273.45703: checking to see if all hosts have failed and the running result is not ok 11762 1726853273.45704: done checking to see if all hosts have failed 11762 1726853273.45704: getting the remaining hosts for this loop 11762 1726853273.45706: done getting the remaining hosts for this loop 11762 1726853273.45710: getting the next task for host managed_node2 11762 1726853273.45718: done getting next task for host managed_node2 11762 1726853273.45720: ^ task is: TASK: Set NM profile exist flag based on the profile files 11762 1726853273.45725: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853273.45729: getting variables 11762 1726853273.45730: in VariableManager get_vars() 11762 1726853273.45763: Calling all_inventory to load vars for managed_node2 11762 1726853273.45766: Calling groups_inventory to load vars for managed_node2 11762 1726853273.45769: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853273.45781: Calling all_plugins_play to load vars for managed_node2 11762 1726853273.45785: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853273.45787: Calling groups_plugins_play to load vars for managed_node2 11762 1726853273.48353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853273.50340: done with get_vars() 11762 1726853273.50360: done getting variables 11762 1726853273.50427: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:53 -0400 (0:00:00.447) 0:00:23.935 ****** 11762 1726853273.50461: entering _queue_task() for managed_node2/set_fact 11762 1726853273.50829: worker is 1 (out of 1 available) 11762 1726853273.50844: exiting _queue_task() for managed_node2/set_fact 11762 1726853273.50866: done queuing things up, now waiting for results queue to drain 11762 1726853273.50868: waiting for pending results... 
11762 1726853273.51214: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11762 1726853273.51378: in run() - task 02083763-bbaf-d845-03d0-0000000005b6 11762 1726853273.51388: variable 'ansible_search_path' from source: unknown 11762 1726853273.51576: variable 'ansible_search_path' from source: unknown 11762 1726853273.51580: calling self._execute() 11762 1726853273.51583: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853273.51586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853273.51588: variable 'omit' from source: magic vars 11762 1726853273.52022: variable 'ansible_distribution_major_version' from source: facts 11762 1726853273.52141: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853273.52188: variable 'profile_stat' from source: set_fact 11762 1726853273.52205: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853273.52213: when evaluation is False, skipping this task 11762 1726853273.52221: _execute() done 11762 1726853273.52229: dumping result to json 11762 1726853273.52236: done dumping result, returning 11762 1726853273.52261: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-d845-03d0-0000000005b6] 11762 1726853273.52276: sending task result for task 02083763-bbaf-d845-03d0-0000000005b6 11762 1726853273.52438: done sending task result for task 02083763-bbaf-d845-03d0-0000000005b6 11762 1726853273.52446: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853273.52502: no more pending results, returning what we have 11762 1726853273.52506: results queue empty 11762 1726853273.52507: checking for any_errors_fatal 11762 1726853273.52515: done checking for any_errors_fatal 11762 1726853273.52516: 
checking for max_fail_percentage 11762 1726853273.52518: done checking for max_fail_percentage 11762 1726853273.52518: checking to see if all hosts have failed and the running result is not ok 11762 1726853273.52519: done checking to see if all hosts have failed 11762 1726853273.52520: getting the remaining hosts for this loop 11762 1726853273.52521: done getting the remaining hosts for this loop 11762 1726853273.52524: getting the next task for host managed_node2 11762 1726853273.52532: done getting next task for host managed_node2 11762 1726853273.52534: ^ task is: TASK: Get NM profile info 11762 1726853273.52539: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11762 1726853273.52541: getting variables 11762 1726853273.52543: in VariableManager get_vars() 11762 1726853273.52576: Calling all_inventory to load vars for managed_node2 11762 1726853273.52579: Calling groups_inventory to load vars for managed_node2 11762 1726853273.52583: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853273.52594: Calling all_plugins_play to load vars for managed_node2 11762 1726853273.52597: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853273.52599: Calling groups_plugins_play to load vars for managed_node2 11762 1726853273.59363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853273.61012: done with get_vars() 11762 1726853273.61049: done getting variables 11762 1726853273.61108: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:53 -0400 (0:00:00.106) 0:00:24.041 ****** 11762 1726853273.61148: entering _queue_task() for managed_node2/shell 11762 1726853273.61548: worker is 1 (out of 1 available) 11762 1726853273.61683: exiting _queue_task() for managed_node2/shell 11762 1726853273.61694: done queuing things up, now waiting for results queue to drain 11762 1726853273.61697: waiting for pending results... 
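The remote temp directories created earlier in the log (`ansible-tmp-<timestamp>-<pid>-<random>` under `~/.ansible/tmp`) come from a fixed `/bin/sh` one-liner: `( umask 77 && mkdir -p "<remote_tmp>" && mkdir "<task_dir>" && echo <name>=<task_dir> )`. A hedged Python sketch of the same pattern — the path and naming scheme here are illustrative, not Ansible's exact format:

```python
import os
import tempfile
import time

def make_task_tmpdir(remote_tmp=None):
    """Create a private per-task temp dir, approximating the shell
    one-liner visible in the log (umask 77 => owner-only access)."""
    if remote_tmp is None:
        # Illustrative base dir; Ansible uses ~/.ansible/tmp by default.
        remote_tmp = os.path.join(tempfile.gettempdir(), "ansible-demo-tmp")
    task_dir = os.path.join(
        remote_tmp, f"ansible-tmp-{time.time()}-{os.getpid()}"
    )
    os.makedirs(remote_tmp, exist_ok=True)
    # mode 0o700 has no group/other bits, so it survives any umask,
    # matching the effect of `umask 77 && mkdir` in the log.
    os.mkdir(task_dir, mode=0o700)
    return task_dir
```

Echoing the directory name back (the `echo name=path` tail of the real command) is what lets the controller parse the tmpdir out of stdout before transferring `AnsiballZ_stat.py` into it, as the sftp `put` in the log shows.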
11762 1726853273.61989: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11762 1726853273.62115: in run() - task 02083763-bbaf-d845-03d0-0000000005b7 11762 1726853273.62140: variable 'ansible_search_path' from source: unknown 11762 1726853273.62152: variable 'ansible_search_path' from source: unknown 11762 1726853273.62195: calling self._execute() 11762 1726853273.62327: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853273.62376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853273.62379: variable 'omit' from source: magic vars 11762 1726853273.62809: variable 'ansible_distribution_major_version' from source: facts 11762 1726853273.62832: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853273.62846: variable 'omit' from source: magic vars 11762 1726853273.62979: variable 'omit' from source: magic vars 11762 1726853273.63054: variable 'profile' from source: include params 11762 1726853273.63064: variable 'bond_port_profile' from source: include params 11762 1726853273.63136: variable 'bond_port_profile' from source: include params 11762 1726853273.63169: variable 'omit' from source: magic vars 11762 1726853273.63219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853273.63303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853273.63307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853273.63317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853273.63335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853273.63376: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11762 1726853273.63385: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853273.63392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853273.63518: Set connection var ansible_timeout to 10 11762 1726853273.63522: Set connection var ansible_shell_type to sh 11762 1726853273.63525: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853273.63576: Set connection var ansible_shell_executable to /bin/sh 11762 1726853273.63580: Set connection var ansible_pipelining to False 11762 1726853273.63582: Set connection var ansible_connection to ssh 11762 1726853273.63593: variable 'ansible_shell_executable' from source: unknown 11762 1726853273.63600: variable 'ansible_connection' from source: unknown 11762 1726853273.63606: variable 'ansible_module_compression' from source: unknown 11762 1726853273.63612: variable 'ansible_shell_type' from source: unknown 11762 1726853273.63619: variable 'ansible_shell_executable' from source: unknown 11762 1726853273.63629: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853273.63637: variable 'ansible_pipelining' from source: unknown 11762 1726853273.63735: variable 'ansible_timeout' from source: unknown 11762 1726853273.63739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853273.63809: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853273.63826: variable 'omit' from source: magic vars 11762 1726853273.63839: starting attempt loop 11762 1726853273.63856: running the handler 11762 1726853273.63874: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853273.63900: _low_level_execute_command(): starting 11762 1726853273.63914: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853273.65352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853273.65357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.65399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.65555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.67351: stdout chunk (state=3): >>>/root <<< 11762 1726853273.67504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.67577: stdout chunk (state=3): >>><<< 11762 1726853273.67581: stderr chunk (state=3): >>><<< 11762 1726853273.67605: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853273.67776: _low_level_execute_command(): starting 11762 1726853273.67780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870 `" && echo ansible-tmp-1726853273.6766253-12844-37973807951870="` echo /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870 `" ) && sleep 0' 11762 1726853273.68860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853273.68876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853273.69080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.69240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.71480: stdout chunk (state=3): >>>ansible-tmp-1726853273.6766253-12844-37973807951870=/root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870 <<< 11762 1726853273.71501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.71521: stdout chunk (state=3): >>><<< 11762 1726853273.71524: stderr chunk (state=3): >>><<< 11762 1726853273.71545: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853273.6766253-12844-37973807951870=/root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853273.71647: variable 'ansible_module_compression' from source: unknown 11762 1726853273.71704: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853273.72120: variable 'ansible_facts' from source: unknown 11762 1726853273.72315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/AnsiballZ_command.py 11762 1726853273.72405: Sending initial data 11762 1726853273.72455: Sent initial data (155 bytes) 11762 1726853273.73873: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.74093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.74199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.75894: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853273.76007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853273.76141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp_jecw9lt /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/AnsiballZ_command.py <<< 11762 1726853273.76155: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/AnsiballZ_command.py" <<< 11762 1726853273.76191: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11762 1726853273.76217: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp_jecw9lt" to remote "/root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/AnsiballZ_command.py" <<< 11762 1726853273.77622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.77726: stderr chunk (state=3): >>><<< 11762 1726853273.77746: stdout chunk (state=3): >>><<< 11762 1726853273.77838: done transferring module to remote 11762 1726853273.77858: _low_level_execute_command(): starting 11762 1726853273.77867: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/ /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/AnsiballZ_command.py && sleep 0' 11762 1726853273.79191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853273.79357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.79689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.79711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853273.81776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853273.81784: stdout chunk (state=3): >>><<< 11762 1726853273.81787: stderr chunk (state=3): >>><<< 11762 1726853273.81789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853273.81796: _low_level_execute_command(): starting 11762 1726853273.81799: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/AnsiballZ_command.py && sleep 0' 11762 1726853273.83336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853273.83400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853273.83435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853273.83460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853273.83652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853274.01407: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:27:53.991086", "end": "2024-09-20 13:27:54.011762", "delta": "0:00:00.020676", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853274.02954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853274.02998: stderr chunk (state=3): >>><<< 11762 1726853274.03011: stdout chunk (state=3): >>><<< 11762 1726853274.03034: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:27:53.991086", "end": "2024-09-20 13:27:54.011762", "delta": "0:00:00.020676", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853274.03103: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853274.03109: _low_level_execute_command(): starting 11762 1726853274.03127: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853273.6766253-12844-37973807951870/ > /dev/null 2>&1 && sleep 0' 11762 1726853274.03949: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853274.03978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853274.03982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853274.04006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853274.04009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853274.04011: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853274.04111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853274.04189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853274.06100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853274.06175: stderr chunk (state=3): >>><<< 11762 1726853274.06181: stdout chunk (state=3): >>><<< 11762 1726853274.06184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853274.06197: handler run complete 11762 1726853274.06217: Evaluated conditional (False): False 11762 1726853274.06221: attempt loop complete, returning result 11762 1726853274.06224: _execute() done 11762 1726853274.06226: dumping result to json 11762 1726853274.06228: done dumping result, returning 11762 1726853274.06236: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-d845-03d0-0000000005b7] 11762 1726853274.06239: sending task result for task 02083763-bbaf-d845-03d0-0000000005b7 11762 1726853274.06357: done sending task result for task 02083763-bbaf-d845-03d0-0000000005b7 11762 1726853274.06360: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020676", "end": "2024-09-20 13:27:54.011762", "rc": 0, "start": "2024-09-20 13:27:53.991086" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11762 1726853274.06475: 
no more pending results, returning what we have 11762 1726853274.06479: results queue empty 11762 1726853274.06480: checking for any_errors_fatal 11762 1726853274.06492: done checking for any_errors_fatal 11762 1726853274.06493: checking for max_fail_percentage 11762 1726853274.06495: done checking for max_fail_percentage 11762 1726853274.06496: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.06496: done checking to see if all hosts have failed 11762 1726853274.06497: getting the remaining hosts for this loop 11762 1726853274.06499: done getting the remaining hosts for this loop 11762 1726853274.06507: getting the next task for host managed_node2 11762 1726853274.06515: done getting next task for host managed_node2 11762 1726853274.06518: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11762 1726853274.06525: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853274.06529: getting variables 11762 1726853274.06531: in VariableManager get_vars() 11762 1726853274.06561: Calling all_inventory to load vars for managed_node2 11762 1726853274.06563: Calling groups_inventory to load vars for managed_node2 11762 1726853274.06566: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.06690: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.06693: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.06696: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.08257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.09391: done with get_vars() 11762 1726853274.09407: done getting variables 11762 1726853274.09453: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:54 -0400 (0:00:00.483) 0:00:24.525 ****** 11762 1726853274.09483: entering _queue_task() for managed_node2/set_fact 11762 1726853274.09752: worker is 1 (out of 1 available) 11762 1726853274.09766: exiting _queue_task() for 
managed_node2/set_fact 11762 1726853274.09780: done queuing things up, now waiting for results queue to drain 11762 1726853274.09781: waiting for pending results... 11762 1726853274.10038: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11762 1726853274.10381: in run() - task 02083763-bbaf-d845-03d0-0000000005b8 11762 1726853274.10384: variable 'ansible_search_path' from source: unknown 11762 1726853274.10388: variable 'ansible_search_path' from source: unknown 11762 1726853274.10390: calling self._execute() 11762 1726853274.10393: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.10395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.10397: variable 'omit' from source: magic vars 11762 1726853274.10850: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.10868: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.11017: variable 'nm_profile_exists' from source: set_fact 11762 1726853274.11038: Evaluated conditional (nm_profile_exists.rc == 0): True 11762 1726853274.11048: variable 'omit' from source: magic vars 11762 1726853274.11175: variable 'omit' from source: magic vars 11762 1726853274.11180: variable 'omit' from source: magic vars 11762 1726853274.11226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853274.11290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853274.11316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853274.11338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.11396: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.11401: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853274.11411: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.11420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.11545: Set connection var ansible_timeout to 10 11762 1726853274.11555: Set connection var ansible_shell_type to sh 11762 1726853274.11565: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853274.11576: Set connection var ansible_shell_executable to /bin/sh 11762 1726853274.11613: Set connection var ansible_pipelining to False 11762 1726853274.11616: Set connection var ansible_connection to ssh 11762 1726853274.11652: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.11655: variable 'ansible_connection' from source: unknown 11762 1726853274.11658: variable 'ansible_module_compression' from source: unknown 11762 1726853274.11692: variable 'ansible_shell_type' from source: unknown 11762 1726853274.11696: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.11699: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.11701: variable 'ansible_pipelining' from source: unknown 11762 1726853274.11704: variable 'ansible_timeout' from source: unknown 11762 1726853274.11706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.11898: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853274.11901: variable 'omit' from source: magic vars 11762 1726853274.11904: starting 
attempt loop 11762 1726853274.11907: running the handler 11762 1726853274.11909: handler run complete 11762 1726853274.11911: attempt loop complete, returning result 11762 1726853274.11913: _execute() done 11762 1726853274.11915: dumping result to json 11762 1726853274.11917: done dumping result, returning 11762 1726853274.11919: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-d845-03d0-0000000005b8] 11762 1726853274.11921: sending task result for task 02083763-bbaf-d845-03d0-0000000005b8 11762 1726853274.12041: done sending task result for task 02083763-bbaf-d845-03d0-0000000005b8 11762 1726853274.12044: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11762 1726853274.12108: no more pending results, returning what we have 11762 1726853274.12111: results queue empty 11762 1726853274.12113: checking for any_errors_fatal 11762 1726853274.12123: done checking for any_errors_fatal 11762 1726853274.12124: checking for max_fail_percentage 11762 1726853274.12126: done checking for max_fail_percentage 11762 1726853274.12127: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.12128: done checking to see if all hosts have failed 11762 1726853274.12128: getting the remaining hosts for this loop 11762 1726853274.12130: done getting the remaining hosts for this loop 11762 1726853274.12134: getting the next task for host managed_node2 11762 1726853274.12144: done getting next task for host managed_node2 11762 1726853274.12147: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11762 1726853274.12155: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853274.12159: getting variables 11762 1726853274.12160: in VariableManager get_vars() 11762 1726853274.12198: Calling all_inventory to load vars for managed_node2 11762 1726853274.12200: Calling groups_inventory to load vars for managed_node2 11762 1726853274.12203: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.12213: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.12216: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.12219: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.13746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.14978: done with get_vars() 11762 1726853274.14999: done getting variables 11762 1726853274.15042: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853274.15167: variable 'profile' from source: include params 11762 1726853274.15174: variable 'bond_port_profile' from source: include params 11762 1726853274.15241: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:54 -0400 (0:00:00.057) 0:00:24.583 ****** 11762 1726853274.15281: entering _queue_task() for managed_node2/command 11762 1726853274.15676: worker is 1 (out of 1 available) 11762 1726853274.15711: exiting _queue_task() for managed_node2/command 11762 1726853274.15726: done queuing things up, now waiting for results queue to drain 11762 1726853274.15728: waiting for pending 
results... 11762 1726853274.16094: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11762 1726853274.16114: in run() - task 02083763-bbaf-d845-03d0-0000000005ba 11762 1726853274.16137: variable 'ansible_search_path' from source: unknown 11762 1726853274.16146: variable 'ansible_search_path' from source: unknown 11762 1726853274.16190: calling self._execute() 11762 1726853274.16288: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.16298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.16312: variable 'omit' from source: magic vars 11762 1726853274.16990: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.16994: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.16996: variable 'profile_stat' from source: set_fact 11762 1726853274.16998: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853274.17000: when evaluation is False, skipping this task 11762 1726853274.17002: _execute() done 11762 1726853274.17005: dumping result to json 11762 1726853274.17007: done dumping result, returning 11762 1726853274.17010: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-d845-03d0-0000000005ba] 11762 1726853274.17012: sending task result for task 02083763-bbaf-d845-03d0-0000000005ba skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853274.17308: no more pending results, returning what we have 11762 1726853274.17312: results queue empty 11762 1726853274.17313: checking for any_errors_fatal 11762 1726853274.17319: done checking for any_errors_fatal 11762 1726853274.17319: checking for max_fail_percentage 11762 1726853274.17321: done checking for max_fail_percentage 11762 1726853274.17322: checking to see if 
all hosts have failed and the running result is not ok 11762 1726853274.17323: done checking to see if all hosts have failed 11762 1726853274.17323: getting the remaining hosts for this loop 11762 1726853274.17325: done getting the remaining hosts for this loop 11762 1726853274.17328: getting the next task for host managed_node2 11762 1726853274.17336: done getting next task for host managed_node2 11762 1726853274.17338: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11762 1726853274.17348: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853274.17383: getting variables 11762 1726853274.17385: in VariableManager get_vars() 11762 1726853274.17426: Calling all_inventory to load vars for managed_node2 11762 1726853274.17430: Calling groups_inventory to load vars for managed_node2 11762 1726853274.17434: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.17451: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.17455: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.17459: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.18184: done sending task result for task 02083763-bbaf-d845-03d0-0000000005ba 11762 1726853274.18187: WORKER PROCESS EXITING 11762 1726853274.18756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.19614: done with get_vars() 11762 1726853274.19630: done getting variables 11762 1726853274.19681: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853274.19769: variable 'profile' from source: include params 11762 1726853274.19774: variable 'bond_port_profile' from source: include params 11762 1726853274.19815: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:54 -0400 (0:00:00.045) 0:00:24.628 ****** 11762 1726853274.19840: entering _queue_task() for managed_node2/set_fact 11762 1726853274.20105: worker is 1 (out of 1 available) 11762 1726853274.20119: exiting _queue_task() for 
managed_node2/set_fact 11762 1726853274.20133: done queuing things up, now waiting for results queue to drain 11762 1726853274.20135: waiting for pending results... 11762 1726853274.20313: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11762 1726853274.20417: in run() - task 02083763-bbaf-d845-03d0-0000000005bb 11762 1726853274.20428: variable 'ansible_search_path' from source: unknown 11762 1726853274.20431: variable 'ansible_search_path' from source: unknown 11762 1726853274.20462: calling self._execute() 11762 1726853274.20538: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.20541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.20552: variable 'omit' from source: magic vars 11762 1726853274.21077: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.21080: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.21085: variable 'profile_stat' from source: set_fact 11762 1726853274.21088: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853274.21090: when evaluation is False, skipping this task 11762 1726853274.21093: _execute() done 11762 1726853274.21095: dumping result to json 11762 1726853274.21098: done dumping result, returning 11762 1726853274.21101: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-d845-03d0-0000000005bb] 11762 1726853274.21104: sending task result for task 02083763-bbaf-d845-03d0-0000000005bb 11762 1726853274.21175: done sending task result for task 02083763-bbaf-d845-03d0-0000000005bb 11762 1726853274.21178: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853274.21317: no more pending results, returning what we have 11762 
1726853274.21321: results queue empty 11762 1726853274.21322: checking for any_errors_fatal 11762 1726853274.21328: done checking for any_errors_fatal 11762 1726853274.21328: checking for max_fail_percentage 11762 1726853274.21330: done checking for max_fail_percentage 11762 1726853274.21331: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.21332: done checking to see if all hosts have failed 11762 1726853274.21332: getting the remaining hosts for this loop 11762 1726853274.21334: done getting the remaining hosts for this loop 11762 1726853274.21337: getting the next task for host managed_node2 11762 1726853274.21346: done getting next task for host managed_node2 11762 1726853274.21349: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11762 1726853274.21355: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853274.21358: getting variables 11762 1726853274.21359: in VariableManager get_vars() 11762 1726853274.21495: Calling all_inventory to load vars for managed_node2 11762 1726853274.21497: Calling groups_inventory to load vars for managed_node2 11762 1726853274.21500: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.21508: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.21510: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.21513: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.22960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.24694: done with get_vars() 11762 1726853274.24720: done getting variables 11762 1726853274.24793: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853274.24925: variable 'profile' from source: include params 11762 1726853274.24929: variable 'bond_port_profile' from source: include params 11762 1726853274.24993: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:54 -0400 (0:00:00.051) 0:00:24.680 ****** 11762 1726853274.25024: entering _queue_task() for managed_node2/command 11762 
1726853274.25385: worker is 1 (out of 1 available) 11762 1726853274.25404: exiting _queue_task() for managed_node2/command 11762 1726853274.25415: done queuing things up, now waiting for results queue to drain 11762 1726853274.25417: waiting for pending results... 11762 1726853274.25790: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 11762 1726853274.25836: in run() - task 02083763-bbaf-d845-03d0-0000000005bc 11762 1726853274.25840: variable 'ansible_search_path' from source: unknown 11762 1726853274.25847: variable 'ansible_search_path' from source: unknown 11762 1726853274.25947: calling self._execute() 11762 1726853274.25951: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.25958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.25969: variable 'omit' from source: magic vars 11762 1726853274.26302: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.26312: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.26432: variable 'profile_stat' from source: set_fact 11762 1726853274.26446: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853274.26449: when evaluation is False, skipping this task 11762 1726853274.26452: _execute() done 11762 1726853274.26455: dumping result to json 11762 1726853274.26457: done dumping result, returning 11762 1726853274.26489: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-d845-03d0-0000000005bc] 11762 1726853274.26577: sending task result for task 02083763-bbaf-d845-03d0-0000000005bc 11762 1726853274.26647: done sending task result for task 02083763-bbaf-d845-03d0-0000000005bc 11762 1726853274.26651: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 
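The skipped results above all follow one pattern: each inspection task in get_profile_stat.yml is guarded by the conditional `profile_stat.stat.exists`, which evaluates False here (no ifcfg-bond0.1 file was found by the earlier stat), so the "Get the ansible_managed comment" and "Get the fingerprint comment" tasks are skipped rather than run. A minimal sketch of such a guarded task — a hypothetical reconstruction for illustration only, since the actual body of get_profile_stat.yml is not reproduced in this log (the `grep` command shown is assumed, not confirmed):

```yaml
# Hypothetical sketch of the guarded pattern seen in the log above.
# The real task body in get_profile_stat.yml is not shown in this log.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "^# System Role:" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: fingerprint_comment
  # profile_stat was registered by an earlier stat task; when it reports
  # the file does not exist, Ansible skips this task with
  # "Conditional result was False", exactly as logged above.
  when: profile_stat.stat.exists
```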
11762 1726853274.26703: no more pending results, returning what we have 11762 1726853274.26706: results queue empty 11762 1726853274.26707: checking for any_errors_fatal 11762 1726853274.26712: done checking for any_errors_fatal 11762 1726853274.26713: checking for max_fail_percentage 11762 1726853274.26714: done checking for max_fail_percentage 11762 1726853274.26715: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.26716: done checking to see if all hosts have failed 11762 1726853274.26716: getting the remaining hosts for this loop 11762 1726853274.26718: done getting the remaining hosts for this loop 11762 1726853274.26721: getting the next task for host managed_node2 11762 1726853274.26727: done getting next task for host managed_node2 11762 1726853274.26729: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11762 1726853274.26734: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853274.26738: getting variables 11762 1726853274.26739: in VariableManager get_vars() 11762 1726853274.26768: Calling all_inventory to load vars for managed_node2 11762 1726853274.26770: Calling groups_inventory to load vars for managed_node2 11762 1726853274.26774: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.26783: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.26785: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.26788: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.28545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.30225: done with get_vars() 11762 1726853274.30253: done getting variables 11762 1726853274.30324: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853274.30456: variable 'profile' from source: include params 11762 1726853274.30460: variable 'bond_port_profile' from source: include params 11762 1726853274.30526: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:54 -0400 (0:00:00.055) 0:00:24.736 ****** 
11762 1726853274.30564: entering _queue_task() for managed_node2/set_fact 11762 1726853274.30928: worker is 1 (out of 1 available) 11762 1726853274.30947: exiting _queue_task() for managed_node2/set_fact 11762 1726853274.30958: done queuing things up, now waiting for results queue to drain 11762 1726853274.30960: waiting for pending results... 11762 1726853274.31258: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11762 1726853274.31356: in run() - task 02083763-bbaf-d845-03d0-0000000005bd 11762 1726853274.31360: variable 'ansible_search_path' from source: unknown 11762 1726853274.31363: variable 'ansible_search_path' from source: unknown 11762 1726853274.31366: calling self._execute() 11762 1726853274.31413: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.31423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.31432: variable 'omit' from source: magic vars 11762 1726853274.31790: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.31801: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.31920: variable 'profile_stat' from source: set_fact 11762 1726853274.31931: Evaluated conditional (profile_stat.stat.exists): False 11762 1726853274.31935: when evaluation is False, skipping this task 11762 1726853274.31938: _execute() done 11762 1726853274.31941: dumping result to json 11762 1726853274.31948: done dumping result, returning 11762 1726853274.32008: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-d845-03d0-0000000005bd] 11762 1726853274.32012: sending task result for task 02083763-bbaf-d845-03d0-0000000005bd 11762 1726853274.32077: done sending task result for task 02083763-bbaf-d845-03d0-0000000005bd 11762 1726853274.32079: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11762 1726853274.32129: no more pending results, returning what we have 11762 1726853274.32133: results queue empty 11762 1726853274.32134: checking for any_errors_fatal 11762 1726853274.32141: done checking for any_errors_fatal 11762 1726853274.32141: checking for max_fail_percentage 11762 1726853274.32146: done checking for max_fail_percentage 11762 1726853274.32147: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.32148: done checking to see if all hosts have failed 11762 1726853274.32148: getting the remaining hosts for this loop 11762 1726853274.32150: done getting the remaining hosts for this loop 11762 1726853274.32153: getting the next task for host managed_node2 11762 1726853274.32162: done getting next task for host managed_node2 11762 1726853274.32165: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11762 1726853274.32170: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853274.32175: getting variables 11762 1726853274.32176: in VariableManager get_vars() 11762 1726853274.32204: Calling all_inventory to load vars for managed_node2 11762 1726853274.32206: Calling groups_inventory to load vars for managed_node2 11762 1726853274.32209: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.32221: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.32224: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.32226: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.33805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.35667: done with get_vars() 11762 1726853274.35693: done getting variables 11762 1726853274.35767: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853274.35905: variable 'profile' from source: include params 11762 1726853274.35909: variable 'bond_port_profile' from source: include params 11762 1726853274.35980: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:54 -0400 (0:00:00.054) 0:00:24.790 ****** 11762 1726853274.36015: entering _queue_task() for managed_node2/assert 11762 1726853274.36402: worker is 1 (out of 1 available) 11762 1726853274.36416: exiting _queue_task() for 
managed_node2/assert 11762 1726853274.36428: done queuing things up, now waiting for results queue to drain 11762 1726853274.36429: waiting for pending results... 11762 1726853274.36798: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' 11762 1726853274.36977: in run() - task 02083763-bbaf-d845-03d0-0000000004e8 11762 1726853274.36982: variable 'ansible_search_path' from source: unknown 11762 1726853274.36985: variable 'ansible_search_path' from source: unknown 11762 1726853274.36988: calling self._execute() 11762 1726853274.37145: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.37158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.37175: variable 'omit' from source: magic vars 11762 1726853274.37577: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.37593: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.37602: variable 'omit' from source: magic vars 11762 1726853274.37661: variable 'omit' from source: magic vars 11762 1726853274.37872: variable 'profile' from source: include params 11762 1726853274.37877: variable 'bond_port_profile' from source: include params 11762 1726853274.37881: variable 'bond_port_profile' from source: include params 11762 1726853274.37897: variable 'omit' from source: magic vars 11762 1726853274.37949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853274.38005: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853274.38034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853274.38061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.38108: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.38126: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853274.38136: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.38178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.38273: Set connection var ansible_timeout to 10 11762 1726853274.38283: Set connection var ansible_shell_type to sh 11762 1726853274.38294: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853274.38310: Set connection var ansible_shell_executable to /bin/sh 11762 1726853274.38329: Set connection var ansible_pipelining to False 11762 1726853274.38341: Set connection var ansible_connection to ssh 11762 1726853274.38412: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.38416: variable 'ansible_connection' from source: unknown 11762 1726853274.38418: variable 'ansible_module_compression' from source: unknown 11762 1726853274.38421: variable 'ansible_shell_type' from source: unknown 11762 1726853274.38424: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.38433: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.38435: variable 'ansible_pipelining' from source: unknown 11762 1726853274.38438: variable 'ansible_timeout' from source: unknown 11762 1726853274.38440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.38630: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853274.38633: variable 'omit' from source: magic vars 11762 1726853274.38636: starting 
attempt loop 11762 1726853274.38638: running the handler 11762 1726853274.38762: variable 'lsr_net_profile_exists' from source: set_fact 11762 1726853274.38775: Evaluated conditional (lsr_net_profile_exists): True 11762 1726853274.38786: handler run complete 11762 1726853274.38805: attempt loop complete, returning result 11762 1726853274.38870: _execute() done 11762 1726853274.38875: dumping result to json 11762 1726853274.38878: done dumping result, returning 11762 1726853274.38880: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' [02083763-bbaf-d845-03d0-0000000004e8] 11762 1726853274.38883: sending task result for task 02083763-bbaf-d845-03d0-0000000004e8 11762 1726853274.38951: done sending task result for task 02083763-bbaf-d845-03d0-0000000004e8 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853274.39120: no more pending results, returning what we have 11762 1726853274.39125: results queue empty 11762 1726853274.39125: checking for any_errors_fatal 11762 1726853274.39135: done checking for any_errors_fatal 11762 1726853274.39136: checking for max_fail_percentage 11762 1726853274.39138: done checking for max_fail_percentage 11762 1726853274.39139: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.39140: done checking to see if all hosts have failed 11762 1726853274.39141: getting the remaining hosts for this loop 11762 1726853274.39146: done getting the remaining hosts for this loop 11762 1726853274.39150: getting the next task for host managed_node2 11762 1726853274.39159: done getting next task for host managed_node2 11762 1726853274.39162: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11762 1726853274.39167: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853274.39174: getting variables 11762 1726853274.39175: in VariableManager get_vars() 11762 1726853274.39210: Calling all_inventory to load vars for managed_node2 11762 1726853274.39213: Calling groups_inventory to load vars for managed_node2 11762 1726853274.39216: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.39227: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.39230: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.39232: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.39788: WORKER PROCESS EXITING 11762 1726853274.40924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.42602: done with get_vars() 11762 1726853274.42634: done getting variables 11762 1726853274.42712: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853274.42854: variable 'profile' from source: include params 11762 1726853274.42859: variable 'bond_port_profile' from source: include params 11762 1726853274.42930: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:54 -0400 (0:00:00.069) 0:00:24.860 ****** 11762 1726853274.42970: entering _queue_task() for managed_node2/assert 11762 1726853274.43406: worker is 1 (out of 1 available) 11762 1726853274.43426: exiting _queue_task() for managed_node2/assert 11762 1726853274.43438: done queuing things up, now waiting for results queue to drain 11762 1726853274.43440: waiting for pending results... 
11762 1726853274.43687: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11762 1726853274.43846: in run() - task 02083763-bbaf-d845-03d0-0000000004e9 11762 1726853274.43881: variable 'ansible_search_path' from source: unknown 11762 1726853274.43890: variable 'ansible_search_path' from source: unknown 11762 1726853274.43935: calling self._execute() 11762 1726853274.44037: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.44052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.44065: variable 'omit' from source: magic vars 11762 1726853274.44465: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.44485: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.44497: variable 'omit' from source: magic vars 11762 1726853274.44568: variable 'omit' from source: magic vars 11762 1726853274.44682: variable 'profile' from source: include params 11762 1726853274.44692: variable 'bond_port_profile' from source: include params 11762 1726853274.44762: variable 'bond_port_profile' from source: include params 11762 1726853274.44795: variable 'omit' from source: magic vars 11762 1726853274.44878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853274.44904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853274.44934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853274.45077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.45081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.45086: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853274.45088: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.45091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.45160: Set connection var ansible_timeout to 10 11762 1726853274.45169: Set connection var ansible_shell_type to sh 11762 1726853274.45182: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853274.45192: Set connection var ansible_shell_executable to /bin/sh 11762 1726853274.45213: Set connection var ansible_pipelining to False 11762 1726853274.45227: Set connection var ansible_connection to ssh 11762 1726853274.45260: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.45268: variable 'ansible_connection' from source: unknown 11762 1726853274.45278: variable 'ansible_module_compression' from source: unknown 11762 1726853274.45319: variable 'ansible_shell_type' from source: unknown 11762 1726853274.45322: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.45325: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.45327: variable 'ansible_pipelining' from source: unknown 11762 1726853274.45329: variable 'ansible_timeout' from source: unknown 11762 1726853274.45331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.45488: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853274.45506: variable 'omit' from source: magic vars 11762 1726853274.45538: starting attempt loop 11762 1726853274.45541: running the handler 11762 1726853274.45876: variable 'lsr_net_profile_ansible_managed' from source: set_fact 
11762 1726853274.45879: Evaluated conditional (lsr_net_profile_ansible_managed): True 11762 1726853274.45882: handler run complete 11762 1726853274.45884: attempt loop complete, returning result 11762 1726853274.45887: _execute() done 11762 1726853274.45889: dumping result to json 11762 1726853274.45892: done dumping result, returning 11762 1726853274.45894: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [02083763-bbaf-d845-03d0-0000000004e9] 11762 1726853274.45896: sending task result for task 02083763-bbaf-d845-03d0-0000000004e9 11762 1726853274.45970: done sending task result for task 02083763-bbaf-d845-03d0-0000000004e9 11762 1726853274.45975: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853274.46029: no more pending results, returning what we have 11762 1726853274.46033: results queue empty 11762 1726853274.46034: checking for any_errors_fatal 11762 1726853274.46040: done checking for any_errors_fatal 11762 1726853274.46041: checking for max_fail_percentage 11762 1726853274.46046: done checking for max_fail_percentage 11762 1726853274.46047: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.46048: done checking to see if all hosts have failed 11762 1726853274.46049: getting the remaining hosts for this loop 11762 1726853274.46052: done getting the remaining hosts for this loop 11762 1726853274.46055: getting the next task for host managed_node2 11762 1726853274.46064: done getting next task for host managed_node2 11762 1726853274.46066: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11762 1726853274.46072: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853274.46078: getting variables 11762 1726853274.46079: in VariableManager get_vars() 11762 1726853274.46118: Calling all_inventory to load vars for managed_node2 11762 1726853274.46121: Calling groups_inventory to load vars for managed_node2 11762 1726853274.46125: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.46138: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.46141: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.46148: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.48045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.50210: done with get_vars() 11762 1726853274.50243: done getting variables 11762 1726853274.50313: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853274.50445: variable 'profile' from source: include params 11762 1726853274.50449: variable 'bond_port_profile' from source: include params 11762 1726853274.50509: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:54 -0400 (0:00:00.075) 0:00:24.935 ****** 11762 1726853274.50550: entering _queue_task() for managed_node2/assert 11762 1726853274.51098: worker is 1 (out of 1 available) 11762 1726853274.51109: exiting _queue_task() for managed_node2/assert 11762 1726853274.51120: done queuing things up, now waiting for results queue to drain 11762 1726853274.51122: waiting for pending results... 
11762 1726853274.51520: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 11762 1726853274.51565: in run() - task 02083763-bbaf-d845-03d0-0000000004ea 11762 1726853274.51588: variable 'ansible_search_path' from source: unknown 11762 1726853274.51614: variable 'ansible_search_path' from source: unknown 11762 1726853274.51650: calling self._execute() 11762 1726853274.51832: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.51836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.51839: variable 'omit' from source: magic vars 11762 1726853274.52195: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.52213: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.52226: variable 'omit' from source: magic vars 11762 1726853274.52290: variable 'omit' from source: magic vars 11762 1726853274.52404: variable 'profile' from source: include params 11762 1726853274.52419: variable 'bond_port_profile' from source: include params 11762 1726853274.52492: variable 'bond_port_profile' from source: include params 11762 1726853274.52522: variable 'omit' from source: magic vars 11762 1726853274.52569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853274.52630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853274.52652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853274.52704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.52708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.52737: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853274.52747: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.52777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.52877: Set connection var ansible_timeout to 10 11762 1726853274.52887: Set connection var ansible_shell_type to sh 11762 1726853274.52898: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853274.52921: Set connection var ansible_shell_executable to /bin/sh 11762 1726853274.52926: Set connection var ansible_pipelining to False 11762 1726853274.52955: Set connection var ansible_connection to ssh 11762 1726853274.52970: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.52980: variable 'ansible_connection' from source: unknown 11762 1726853274.53030: variable 'ansible_module_compression' from source: unknown 11762 1726853274.53033: variable 'ansible_shell_type' from source: unknown 11762 1726853274.53035: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.53038: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.53040: variable 'ansible_pipelining' from source: unknown 11762 1726853274.53042: variable 'ansible_timeout' from source: unknown 11762 1726853274.53044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.53214: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853274.53229: variable 'omit' from source: magic vars 11762 1726853274.53244: starting attempt loop 11762 1726853274.53287: running the handler 11762 1726853274.53379: variable 'lsr_net_profile_fingerprint' from source: set_fact 
11762 1726853274.53388: Evaluated conditional (lsr_net_profile_fingerprint): True 11762 1726853274.53462: handler run complete 11762 1726853274.53465: attempt loop complete, returning result 11762 1726853274.53467: _execute() done 11762 1726853274.53470: dumping result to json 11762 1726853274.53474: done dumping result, returning 11762 1726853274.53476: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 [02083763-bbaf-d845-03d0-0000000004ea] 11762 1726853274.53478: sending task result for task 02083763-bbaf-d845-03d0-0000000004ea 11762 1726853274.53625: done sending task result for task 02083763-bbaf-d845-03d0-0000000004ea 11762 1726853274.53628: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853274.53683: no more pending results, returning what we have 11762 1726853274.53688: results queue empty 11762 1726853274.53688: checking for any_errors_fatal 11762 1726853274.53696: done checking for any_errors_fatal 11762 1726853274.53697: checking for max_fail_percentage 11762 1726853274.53700: done checking for max_fail_percentage 11762 1726853274.53701: checking to see if all hosts have failed and the running result is not ok 11762 1726853274.53701: done checking to see if all hosts have failed 11762 1726853274.53703: getting the remaining hosts for this loop 11762 1726853274.53705: done getting the remaining hosts for this loop 11762 1726853274.53708: getting the next task for host managed_node2 11762 1726853274.53720: done getting next task for host managed_node2 11762 1726853274.53723: ^ task is: TASK: ** TEST check bond settings 11762 1726853274.53727: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853274.53731: getting variables 11762 1726853274.53733: in VariableManager get_vars() 11762 1726853274.53774: Calling all_inventory to load vars for managed_node2 11762 1726853274.53777: Calling groups_inventory to load vars for managed_node2 11762 1726853274.53781: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853274.53793: Calling all_plugins_play to load vars for managed_node2 11762 1726853274.53797: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853274.53800: Calling groups_plugins_play to load vars for managed_node2 11762 1726853274.56141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853274.59466: done with get_vars() 11762 1726853274.59707: done getting variables 11762 1726853274.59772: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 13:27:54 -0400 (0:00:00.093) 0:00:25.030 ****** 11762 1726853274.60017: entering _queue_task() for managed_node2/command 11762 1726853274.60711: worker is 1 (out of 1 available) 11762 1726853274.60727: exiting _queue_task() for managed_node2/command 11762 1726853274.60738: done queuing things up, now waiting for results queue to drain 11762 1726853274.60740: waiting for pending results... 11762 1726853274.61166: running TaskExecutor() for managed_node2/TASK: ** TEST check bond settings 11762 1726853274.61400: in run() - task 02083763-bbaf-d845-03d0-000000000400 11762 1726853274.61422: variable 'ansible_search_path' from source: unknown 11762 1726853274.61429: variable 'ansible_search_path' from source: unknown 11762 1726853274.61470: variable 'bond_options_to_assert' from source: play vars 11762 1726853274.62277: variable 'bond_options_to_assert' from source: play vars 11762 1726853274.62458: variable 'omit' from source: magic vars 11762 1726853274.62708: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.62716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.62727: variable 'omit' from source: magic vars 11762 1726853274.63194: variable 'ansible_distribution_major_version' from source: facts 11762 1726853274.63317: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853274.63323: variable 'omit' from source: magic vars 11762 1726853274.63378: variable 'omit' from source: magic vars 11762 1726853274.63866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853274.69997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853274.70063: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853274.70110: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853274.70143: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853274.70173: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853274.70669: variable 'controller_device' from source: play vars 11762 1726853274.70674: variable 'bond_opt' from source: unknown 11762 1726853274.70701: variable 'omit' from source: magic vars 11762 1726853274.70731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853274.70764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853274.71124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853274.71128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.71130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853274.71133: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853274.71135: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.71137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.71235: Set connection var ansible_timeout to 10 11762 1726853274.71238: Set connection var ansible_shell_type to sh 11762 1726853274.71244: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853274.71286: Set connection var ansible_shell_executable to /bin/sh 11762 1726853274.71289: Set connection var 
ansible_pipelining to False 11762 1726853274.71292: Set connection var ansible_connection to ssh 11762 1726853274.71779: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.71783: variable 'ansible_connection' from source: unknown 11762 1726853274.71785: variable 'ansible_module_compression' from source: unknown 11762 1726853274.71787: variable 'ansible_shell_type' from source: unknown 11762 1726853274.71795: variable 'ansible_shell_executable' from source: unknown 11762 1726853274.71797: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853274.71799: variable 'ansible_pipelining' from source: unknown 11762 1726853274.71802: variable 'ansible_timeout' from source: unknown 11762 1726853274.71804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853274.71826: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853274.71836: variable 'omit' from source: magic vars 11762 1726853274.71841: starting attempt loop 11762 1726853274.71844: running the handler 11762 1726853274.72015: _low_level_execute_command(): starting 11762 1726853274.72018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853274.73794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853274.73800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853274.73890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853274.74000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853274.75767: stdout chunk (state=3): >>>/root <<< 11762 1726853274.75907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853274.75961: stderr chunk (state=3): >>><<< 11762 1726853274.75964: stdout chunk (state=3): >>><<< 11762 1726853274.75994: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853274.76011: _low_level_execute_command(): starting 11762 1726853274.76023: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796 `" && echo ansible-tmp-1726853274.7599413-12895-40059322375796="` echo /root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796 `" ) && sleep 0' 11762 1726853274.77288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853274.77493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853274.77886: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853274.78287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853274.80338: stdout chunk (state=3): >>>ansible-tmp-1726853274.7599413-12895-40059322375796=/root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796 <<< 11762 1726853274.80485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853274.80490: stdout chunk (state=3): >>><<< 11762 1726853274.80496: stderr chunk (state=3): >>><<< 11762 1726853274.80514: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853274.7599413-12895-40059322375796=/root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853274.80548: variable 'ansible_module_compression' from source: unknown 
11762 1726853274.80597: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853274.80637: variable 'ansible_facts' from source: unknown 11762 1726853274.80811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/AnsiballZ_command.py 11762 1726853274.81229: Sending initial data 11762 1726853274.81232: Sent initial data (155 bytes) 11762 1726853274.82547: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853274.82551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853274.82786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853274.84432: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853274.84514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853274.84626: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp1h_zjuar /root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/AnsiballZ_command.py <<< 11762 1726853274.84630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/AnsiballZ_command.py" <<< 11762 1726853274.84699: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp1h_zjuar" to remote "/root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/AnsiballZ_command.py" <<< 11762 1726853274.86578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853274.86582: stdout chunk (state=3): >>><<< 11762 1726853274.86585: stderr chunk (state=3): >>><<< 11762 1726853274.86587: done transferring module to remote 11762 1726853274.86708: _low_level_execute_command(): starting 11762 1726853274.86711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/ 
/root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/AnsiballZ_command.py && sleep 0' 11762 1726853274.88587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853274.88591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853274.88594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853274.88597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853274.88603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853274.88605: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853274.88607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853274.88609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853274.88611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853274.88805: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853274.89386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853274.89659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853274.91605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 
1726853274.91609: stdout chunk (state=3): >>><<< 11762 1726853274.91615: stderr chunk (state=3): >>><<< 11762 1726853274.91634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853274.91637: _low_level_execute_command(): starting 11762 1726853274.91662: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/AnsiballZ_command.py && sleep 0' 11762 1726853274.93239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853274.93382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853274.93386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853274.93754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.09935: stdout chunk (state=3): >>> {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 13:27:55.095203", "end": "2024-09-20 13:27:55.098356", "delta": "0:00:00.003153", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853275.11789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853275.11793: stderr chunk (state=3): >>><<< 11762 1726853275.11795: stdout chunk (state=3): >>><<< 11762 1726853275.11827: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 13:27:55.095203", "end": "2024-09-20 13:27:55.098356", "delta": "0:00:00.003153", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
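The module result above feeds the loop's success test that the log evaluates a few lines later: `Evaluated conditional (bond_opt.value in result.stdout): True`. It is a plain substring check, which works here because sysfs `bonding/mode` prints the symbolic name followed by the numeric code (`802.3ad 4`). A sketch of that check, assuming only what the log shows:

```python
def bond_option_ok(expected: str, sysfs_stdout: str) -> bool:
    """Mirror of the logged conditional `bond_opt.value in result.stdout`:
    the test passes when the expected option value appears anywhere in
    the output of `cat /sys/class/net/nm-bond/bonding/<option>`."""
    return expected in sysfs_stdout
```

For the looped item `{'key': 'mode', 'value': '802.3ad'}` this reduces to `'802.3ad' in '802.3ad 4'`, hence the `ok:` result for managed_node2.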
11762 1726853275.11894: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853275.11902: _low_level_execute_command(): starting 11762 1726853275.11904: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853274.7599413-12895-40059322375796/ > /dev/null 2>&1 && sleep 0' 11762 1726853275.12548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.12563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.12583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.12601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.12618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853275.12725: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.12737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.12844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.14847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.15465: stdout chunk (state=3): >>><<< 11762 1726853275.15469: stderr chunk (state=3): >>><<< 11762 1726853275.15474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.15477: handler run complete 11762 1726853275.15479: Evaluated conditional (False): False 11762 1726853275.15481: variable 'bond_opt' from source: unknown 11762 1726853275.15483: variable 'result' from source: unknown 11762 1726853275.15877: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853275.15881: attempt loop complete, returning result 11762 1726853275.15885: variable 'bond_opt' from source: unknown 11762 1726853275.15887: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'mode', 'value': '802.3ad'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "802.3ad" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003153", "end": "2024-09-20 13:27:55.098356", "rc": 0, "start": "2024-09-20 13:27:55.095203" } STDOUT: 802.3ad 4 11762 1726853275.16476: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853275.16480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853275.16483: variable 'omit' from source: magic vars 11762 1726853275.16486: variable 'ansible_distribution_major_version' from source: facts 11762 1726853275.16876: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853275.16880: variable 'omit' from source: magic vars 11762 1726853275.16882: variable 'omit' from source: magic vars 11762 1726853275.16884: variable 'controller_device' from source: play vars 11762 1726853275.16886: variable 'bond_opt' from source: unknown 11762 1726853275.17200: variable 'omit' from source: magic vars 11762 1726853275.17245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853275.17262: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853275.17278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853275.17299: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853275.17307: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853275.17315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853275.17403: Set connection var ansible_timeout to 10 11762 1726853275.17776: Set connection var ansible_shell_type to sh 11762 1726853275.17779: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853275.17781: Set connection var ansible_shell_executable to /bin/sh 11762 1726853275.17783: Set connection var ansible_pipelining to False 11762 1726853275.17785: Set connection var ansible_connection to ssh 11762 1726853275.17787: variable 'ansible_shell_executable' from source: unknown 11762 1726853275.17789: variable 'ansible_connection' from source: unknown 11762 1726853275.17791: variable 'ansible_module_compression' from source: unknown 11762 1726853275.17797: variable 'ansible_shell_type' from source: unknown 11762 1726853275.17799: variable 'ansible_shell_executable' from source: unknown 11762 1726853275.17801: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853275.17803: variable 'ansible_pipelining' from source: unknown 11762 1726853275.17805: variable 'ansible_timeout' from source: unknown 11762 1726853275.17807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853275.17809: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853275.17811: variable 'omit' from source: magic vars 11762 1726853275.17813: starting attempt loop 11762 1726853275.17815: running the handler 11762 1726853275.17816: _low_level_execute_command(): starting 11762 1726853275.17818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853275.19502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.19613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.19936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.20002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.21709: stdout chunk (state=3): >>>/root <<< 11762 1726853275.21908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 
1726853275.21922: stdout chunk (state=3): >>><<< 11762 1726853275.21935: stderr chunk (state=3): >>><<< 11762 1726853275.21959: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.21977: _low_level_execute_command(): starting 11762 1726853275.21992: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427 `" && echo ansible-tmp-1726853275.2196581-12895-109690536863427="` echo /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427 `" ) && sleep 0' 11762 1726853275.23111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.23239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.23256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.23266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.23515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.23527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.23550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.24167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.26066: stdout chunk (state=3): >>>ansible-tmp-1726853275.2196581-12895-109690536863427=/root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427 <<< 11762 1726853275.26784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.26788: stdout chunk (state=3): >>><<< 11762 1726853275.26791: stderr chunk (state=3): >>><<< 11762 1726853275.26794: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853275.2196581-12895-109690536863427=/root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.26796: variable 'ansible_module_compression' from source: unknown 11762 1726853275.26798: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853275.26800: variable 'ansible_facts' from source: unknown 11762 1726853275.26802: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/AnsiballZ_command.py 11762 1726853275.27322: Sending initial data 11762 1726853275.27345: Sent initial data (156 bytes) 11762 1726853275.28502: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.28506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853275.28508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.28558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.28570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.28887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.30635: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853275.30775: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853275.30801: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp29a910zp /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/AnsiballZ_command.py <<< 11762 1726853275.30829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/AnsiballZ_command.py" <<< 11762 1726853275.30932: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp29a910zp" to remote "/root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/AnsiballZ_command.py" <<< 11762 1726853275.30936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/AnsiballZ_command.py" <<< 11762 1726853275.32146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.32270: stderr chunk (state=3): >>><<< 11762 1726853275.32279: stdout chunk (state=3): >>><<< 11762 1726853275.32281: done transferring module to remote 11762 1726853275.32284: _low_level_execute_command(): starting 11762 1726853275.32286: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/ /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/AnsiballZ_command.py && sleep 0' 11762 1726853275.33435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.33440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853275.33528: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.33532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.33566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.33570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.33689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.33695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.33793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.33869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.35781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.35807: stderr chunk (state=3): >>><<< 11762 1726853275.35816: stdout chunk (state=3): >>><<< 11762 1726853275.35878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.35881: _low_level_execute_command(): starting 11762 1726853275.35884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/AnsiballZ_command.py && sleep 0' 11762 1726853275.36991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853275.37080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.53097: stdout chunk (state=3): >>> {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 13:27:55.526661", "end": "2024-09-20 13:27:55.529903", "delta": "0:00:00.003242", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853275.54899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.54911: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 11762 1726853275.54969: stderr chunk (state=3): >>><<< 11762 1726853275.54993: stdout chunk (state=3): >>><<< 11762 1726853275.55016: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 13:27:55.526661", "end": "2024-09-20 13:27:55.529903", "delta": "0:00:00.003242", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853275.55057: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853275.55067: _low_level_execute_command(): starting 11762 1726853275.55077: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853275.2196581-12895-109690536863427/ > /dev/null 2>&1 && sleep 0' 11762 
1726853275.55731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.55787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.55803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.55895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.55911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.55934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.55952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.56053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.58094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.58106: stdout chunk (state=3): >>><<< 11762 1726853275.58177: stderr chunk (state=3): >>><<< 11762 1726853275.58182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.58185: handler run complete 11762 1726853275.58187: Evaluated conditional (False): False 11762 1726853275.58366: variable 'bond_opt' from source: unknown 11762 1726853275.58380: variable 'result' from source: unknown 11762 1726853275.58399: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853275.58425: attempt loop complete, returning result 11762 1726853275.58452: variable 'bond_opt' from source: unknown 11762 1726853275.58676: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_actor_sys_prio', 'value': '65535'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_sys_prio", "value": "65535" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio" ], "delta": "0:00:00.003242", "end": "2024-09-20 13:27:55.529903", "rc": 0, "start": "2024-09-20 13:27:55.526661" } STDOUT: 65535 11762 1726853275.58783: variable 'ansible_host' from source: 
host vars for 'managed_node2' 11762 1726853275.58786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853275.58789: variable 'omit' from source: magic vars 11762 1726853275.59124: variable 'ansible_distribution_major_version' from source: facts 11762 1726853275.59127: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853275.59129: variable 'omit' from source: magic vars 11762 1726853275.59131: variable 'omit' from source: magic vars 11762 1726853275.59142: variable 'controller_device' from source: play vars 11762 1726853275.59153: variable 'bond_opt' from source: unknown 11762 1726853275.59175: variable 'omit' from source: magic vars 11762 1726853275.59197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853275.59207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853275.59215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853275.59238: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853275.59247: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853275.59253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853275.59323: Set connection var ansible_timeout to 10 11762 1726853275.59329: Set connection var ansible_shell_type to sh 11762 1726853275.59349: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853275.59357: Set connection var ansible_shell_executable to /bin/sh 11762 1726853275.59367: Set connection var ansible_pipelining to False 11762 1726853275.59379: Set connection var ansible_connection to ssh 11762 1726853275.59399: variable 'ansible_shell_executable' from 
source: unknown 11762 1726853275.59451: variable 'ansible_connection' from source: unknown 11762 1726853275.59454: variable 'ansible_module_compression' from source: unknown 11762 1726853275.59456: variable 'ansible_shell_type' from source: unknown 11762 1726853275.59457: variable 'ansible_shell_executable' from source: unknown 11762 1726853275.59459: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853275.59461: variable 'ansible_pipelining' from source: unknown 11762 1726853275.59462: variable 'ansible_timeout' from source: unknown 11762 1726853275.59464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853275.59529: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853275.59558: variable 'omit' from source: magic vars 11762 1726853275.59573: starting attempt loop 11762 1726853275.59656: running the handler 11762 1726853275.59662: _low_level_execute_command(): starting 11762 1726853275.59664: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853275.60282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.60296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.60310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.60339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.60392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.60467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.60490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.60511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.60889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.62601: stdout chunk (state=3): >>>/root <<< 11762 1726853275.62740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.62977: stderr chunk (state=3): >>><<< 11762 1726853275.62981: stdout chunk (state=3): >>><<< 11762 1726853275.62983: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.62986: _low_level_execute_command(): starting 11762 1726853275.62988: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384 `" && echo ansible-tmp-1726853275.6279976-12895-149484058654384="` echo /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384 `" ) && sleep 0' 11762 1726853275.63427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.63438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.63444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.63460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.63477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853275.63481: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853275.63490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.63503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853275.63510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853275.63516: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 11762 1726853275.63524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.63532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.63544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.63553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853275.63559: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853275.63568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.63632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.63644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.63664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.63761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.66077: stdout chunk (state=3): >>>ansible-tmp-1726853275.6279976-12895-149484058654384=/root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384 <<< 11762 1726853275.66081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.66084: stderr chunk (state=3): >>><<< 11762 1726853275.66086: stdout chunk (state=3): >>><<< 11762 1726853275.66088: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853275.6279976-12895-149484058654384=/root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.66090: variable 'ansible_module_compression' from source: unknown 11762 1726853275.66092: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853275.66098: variable 'ansible_facts' from source: unknown 11762 1726853275.66191: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/AnsiballZ_command.py 11762 1726853275.66469: Sending initial data 11762 1726853275.66474: Sent initial data (156 bytes) 11762 1726853275.67123: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.67133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.67223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.67254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.67268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.67282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.67389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.69069: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853275.69095: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853275.69399: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853275.69481: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpi7d513vm /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/AnsiballZ_command.py <<< 11762 1726853275.69484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/AnsiballZ_command.py" <<< 11762 1726853275.69545: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpi7d513vm" to remote "/root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/AnsiballZ_command.py" <<< 11762 1726853275.70556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.70628: stderr chunk (state=3): >>><<< 11762 1726853275.70631: stdout chunk (state=3): >>><<< 11762 1726853275.70689: done transferring module to remote 11762 1726853275.70697: _low_level_execute_command(): starting 11762 1726853275.70704: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/ /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/AnsiballZ_command.py && sleep 0' 11762 1726853275.71319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.71366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.71370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.71374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.71383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.71394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.71440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.71485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853275.71543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.71548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.71615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.73546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.73551: stdout chunk (state=3): >>><<< 11762 1726853275.73553: stderr chunk (state=3): >>><<< 11762 1726853275.73578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.73667: _low_level_execute_command(): starting 11762 1726853275.73672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/AnsiballZ_command.py && sleep 0' 11762 1726853275.74286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853275.74301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853275.74328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853275.74351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853275.74380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853275.74437: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.74502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.74522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.74869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.91295: stdout chunk (state=3): >>> {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 13:27:55.908563", "end": "2024-09-20 13:27:55.911797", "delta": "0:00:00.003234", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853275.93207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853275.93212: stdout chunk (state=3): >>><<< 11762 1726853275.93214: stderr chunk (state=3): >>><<< 11762 1726853275.93407: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 13:27:55.908563", "end": "2024-09-20 13:27:55.911797", "delta": "0:00:00.003234", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853275.93412: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_system', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853275.93414: _low_level_execute_command(): starting 11762 1726853275.93416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853275.6279976-12895-149484058654384/ > /dev/null 2>&1 && sleep 0' 11762 1726853275.95024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853275.95310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853275.95394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853275.95842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853275.97632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853275.97636: stdout chunk (state=3): >>><<< 11762 1726853275.97638: stderr chunk (state=3): >>><<< 11762 1726853275.97641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853275.97646: handler run complete 11762 1726853275.97648: Evaluated conditional (False): False 11762 
1726853275.97740: variable 'bond_opt' from source: unknown 11762 1726853275.98287: variable 'result' from source: unknown 11762 1726853275.98300: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853275.98310: attempt loop complete, returning result 11762 1726853275.98329: variable 'bond_opt' from source: unknown 11762 1726853275.98401: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_actor_system', 'value': '00:00:5e:00:53:5d'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_system", "value": "00:00:5e:00:53:5d" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_system" ], "delta": "0:00:00.003234", "end": "2024-09-20 13:27:55.911797", "rc": 0, "start": "2024-09-20 13:27:55.908563" } STDOUT: 00:00:5e:00:53:5d 11762 1726853275.99013: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853275.99016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853275.99019: variable 'omit' from source: magic vars 11762 1726853275.99488: variable 'ansible_distribution_major_version' from source: facts 11762 1726853275.99494: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853275.99496: variable 'omit' from source: magic vars 11762 1726853275.99514: variable 'omit' from source: magic vars 11762 1726853275.99882: variable 'controller_device' from source: play vars 11762 1726853275.99887: variable 'bond_opt' from source: unknown 11762 1726853275.99907: variable 'omit' from source: magic vars 11762 1726853275.99928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853275.99939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853275.99944: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853275.99956: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853275.99959: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853275.99961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.00051: Set connection var ansible_timeout to 10 11762 1726853276.00054: Set connection var ansible_shell_type to sh 11762 1726853276.00057: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853276.00059: Set connection var ansible_shell_executable to /bin/sh 11762 1726853276.00061: Set connection var ansible_pipelining to False 11762 1726853276.00063: Set connection var ansible_connection to ssh 11762 1726853276.00774: variable 'ansible_shell_executable' from source: unknown 11762 1726853276.00778: variable 'ansible_connection' from source: unknown 11762 1726853276.00780: variable 'ansible_module_compression' from source: unknown 11762 1726853276.00782: variable 'ansible_shell_type' from source: unknown 11762 1726853276.00785: variable 'ansible_shell_executable' from source: unknown 11762 1726853276.00787: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853276.00789: variable 'ansible_pipelining' from source: unknown 11762 1726853276.00791: variable 'ansible_timeout' from source: unknown 11762 1726853276.00793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.00802: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853276.00810: variable 'omit' from source: magic vars 11762 1726853276.00814: starting 
attempt loop 11762 1726853276.00817: running the handler 11762 1726853276.00824: _low_level_execute_command(): starting 11762 1726853276.00826: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853276.02682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.02687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.02689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.02692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853276.02694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.02989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.03555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.05139: stdout chunk (state=3): >>>/root <<< 11762 1726853276.05362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.05366: stdout chunk (state=3): >>><<< 11762 1726853276.05368: stderr chunk (state=3): >>><<< 11762 
1726853276.05439: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.05445: _low_level_execute_command(): starting 11762 1726853276.05448: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591 `" && echo ansible-tmp-1726853276.0538847-12895-163713830861591="` echo /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591 `" ) && sleep 0' 11762 1726853276.06859: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.07153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.07185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.07224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.07297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.09281: stdout chunk (state=3): >>>ansible-tmp-1726853276.0538847-12895-163713830861591=/root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591 <<< 11762 1726853276.09432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.09641: stdout chunk (state=3): >>><<< 11762 1726853276.09647: stderr chunk (state=3): >>><<< 11762 1726853276.09649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853276.0538847-12895-163713830861591=/root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.09652: variable 'ansible_module_compression' from source: unknown 11762 1726853276.09654: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853276.09687: variable 'ansible_facts' from source: unknown 11762 1726853276.09827: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/AnsiballZ_command.py 11762 1726853276.10108: Sending initial data 11762 1726853276.10118: Sent initial data (156 bytes) 11762 1726853276.11400: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.11551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.11613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.11778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.13454: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853276.13550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853276.13945: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpe51ct7ys /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/AnsiballZ_command.py <<< 11762 1726853276.13949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/AnsiballZ_command.py" <<< 11762 1726853276.14109: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpe51ct7ys" to remote "/root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/AnsiballZ_command.py" <<< 11762 1726853276.14112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/AnsiballZ_command.py" <<< 11762 1726853276.15409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.15452: stderr chunk (state=3): >>><<< 11762 1726853276.15456: stdout chunk (state=3): >>><<< 11762 1726853276.15518: done transferring module to remote 11762 1726853276.15526: _low_level_execute_command(): starting 11762 1726853276.15531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/ /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/AnsiballZ_command.py && sleep 0' 11762 1726853276.16787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.16890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.16902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.16986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.17138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.19481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.19485: stdout chunk (state=3): >>><<< 11762 1726853276.19487: stderr chunk (state=3): >>><<< 11762 1726853276.19490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.19499: _low_level_execute_command(): starting 11762 1726853276.19501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/AnsiballZ_command.py && sleep 0' 11762 1726853276.20600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.20787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.20964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.21048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.37007: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 13:27:56.365799", "end": "2024-09-20 13:27:56.369018", "delta": "0:00:00.003219", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853276.38736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853276.38740: stdout chunk (state=3): >>><<< 11762 1726853276.38747: stderr chunk (state=3): >>><<< 11762 1726853276.38779: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 13:27:56.365799", "end": "2024-09-20 13:27:56.369018", "delta": "0:00:00.003219", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853276.38807: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_select', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853276.38812: _low_level_execute_command(): starting 11762 1726853276.38825: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853276.0538847-12895-163713830861591/ > /dev/null 2>&1 && sleep 0' 11762 1726853276.40019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.40023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.40049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.40249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.42228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.42232: stdout chunk (state=3): >>><<< 11762 1726853276.42237: stderr chunk (state=3): >>><<< 11762 1726853276.42256: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.42264: handler run complete 11762 1726853276.42285: Evaluated conditional (False): False 11762 1726853276.42438: variable 'bond_opt' from source: unknown 11762 1726853276.42446: variable 'result' from source: unknown 11762 1726853276.42459: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853276.42466: attempt loop complete, returning result 11762 1726853276.42488: variable 'bond_opt' from source: unknown 11762 1726853276.42549: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_select', 'value': 'stable'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_select", "value": "stable" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_select" ], "delta": "0:00:00.003219", "end": "2024-09-20 13:27:56.369018", "rc": 0, "start": "2024-09-20 13:27:56.365799" } STDOUT: stable 0 11762 1726853276.42694: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853276.42698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.42700: variable 'omit' from source: magic vars 11762 1726853276.42987: variable 'ansible_distribution_major_version' from source: facts 11762 1726853276.42990: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853276.42993: variable 'omit' from source: magic vars 11762 1726853276.42996: variable 'omit' from source: magic vars 11762 
1726853276.42998: variable 'controller_device' from source: play vars 11762 1726853276.43000: variable 'bond_opt' from source: unknown 11762 1726853276.43018: variable 'omit' from source: magic vars 11762 1726853276.43040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853276.43093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853276.43099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853276.43102: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853276.43104: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853276.43106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.43154: Set connection var ansible_timeout to 10 11762 1726853276.43158: Set connection var ansible_shell_type to sh 11762 1726853276.43163: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853276.43200: Set connection var ansible_shell_executable to /bin/sh 11762 1726853276.43208: Set connection var ansible_pipelining to False 11762 1726853276.43211: Set connection var ansible_connection to ssh 11762 1726853276.43213: variable 'ansible_shell_executable' from source: unknown 11762 1726853276.43215: variable 'ansible_connection' from source: unknown 11762 1726853276.43218: variable 'ansible_module_compression' from source: unknown 11762 1726853276.43220: variable 'ansible_shell_type' from source: unknown 11762 1726853276.43222: variable 'ansible_shell_executable' from source: unknown 11762 1726853276.43224: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853276.43226: variable 'ansible_pipelining' from source: unknown 11762 
1726853276.43228: variable 'ansible_timeout' from source: unknown 11762 1726853276.43229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.43309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853276.43316: variable 'omit' from source: magic vars 11762 1726853276.43319: starting attempt loop 11762 1726853276.43322: running the handler 11762 1726853276.43328: _low_level_execute_command(): starting 11762 1726853276.43330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853276.43918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.43961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.43966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.43969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.43973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853276.43975: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853276.43977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.43998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853276.44001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853276.44076: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.44091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.44288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.45940: stdout chunk (state=3): >>>/root <<< 11762 1726853276.46159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.46163: stdout chunk (state=3): >>><<< 11762 1726853276.46285: stderr chunk (state=3): >>><<< 11762 1726853276.46289: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.46292: _low_level_execute_command(): starting 11762 1726853276.46294: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742 `" && echo ansible-tmp-1726853276.4619424-12895-188209541723742="` echo /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742 `" ) && sleep 0' 11762 1726853276.46916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.46932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.46951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.46969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.46989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853276.47029: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.47045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853276.47086: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.47147: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.47179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.47219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.47292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.49342: stdout chunk (state=3): >>>ansible-tmp-1726853276.4619424-12895-188209541723742=/root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742 <<< 11762 1726853276.49506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.49509: stdout chunk (state=3): >>><<< 11762 1726853276.49512: stderr chunk (state=3): >>><<< 11762 1726853276.49626: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853276.4619424-12895-188209541723742=/root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.49629: variable 'ansible_module_compression' from source: unknown 11762 1726853276.49631: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853276.49633: variable 'ansible_facts' from source: unknown 11762 1726853276.49708: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/AnsiballZ_command.py 11762 1726853276.49877: Sending initial data 11762 1726853276.49885: Sent initial data (156 bytes) 11762 1726853276.50465: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.50482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.50495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.50520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.50589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.50638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 11762 1726853276.50656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.50682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.50790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.52496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853276.52594: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853276.52675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpeddeh62l /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/AnsiballZ_command.py <<< 11762 1726853276.52679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/AnsiballZ_command.py" <<< 11762 1726853276.52754: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpeddeh62l" to remote "/root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/AnsiballZ_command.py" <<< 11762 1726853276.53684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.53721: stderr chunk (state=3): >>><<< 11762 1726853276.53738: stdout chunk (state=3): >>><<< 11762 1726853276.53814: done transferring module to remote 11762 1726853276.53828: _low_level_execute_command(): starting 11762 1726853276.53847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/ /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/AnsiballZ_command.py && sleep 0' 11762 1726853276.54827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.54900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.56973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.56977: stdout chunk (state=3): >>><<< 11762 1726853276.56979: stderr chunk (state=3): >>><<< 11762 1726853276.57005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.57085: _low_level_execute_command(): starting 11762 1726853276.57088: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/AnsiballZ_command.py && sleep 0' 11762 1726853276.57664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.57687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.57746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853276.57759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.57844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.57894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.57992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.74790: stdout chunk (state=3): >>> {"changed": true, "stdout": "1023", "stderr": "", "rc": 
0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 13:27:56.742287", "end": "2024-09-20 13:27:56.745486", "delta": "0:00:00.003199", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853276.76368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.76388: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 11762 1726853276.76450: stderr chunk (state=3): >>><<< 11762 1726853276.76469: stdout chunk (state=3): >>><<< 11762 1726853276.76501: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 13:27:56.742287", "end": "2024-09-20 13:27:56.745486", "delta": "0:00:00.003199", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853276.76533: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_user_port_key', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853276.76547: _low_level_execute_command(): starting 11762 1726853276.76557: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853276.4619424-12895-188209541723742/ > /dev/null 2>&1 && sleep 0' 11762 1726853276.77200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.77226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.77246: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11762 1726853276.77265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.77341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.77397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.77416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.77451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.77642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.79596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.79616: stdout chunk (state=3): >>><<< 11762 1726853276.79619: stderr chunk (state=3): >>><<< 11762 1726853276.79777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.79780: handler run complete 11762 1726853276.79782: Evaluated conditional (False): False 11762 1726853276.79822: variable 'bond_opt' from source: unknown 11762 1726853276.79834: variable 'result' from source: unknown 11762 1726853276.79855: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853276.79873: attempt loop complete, returning result 11762 1726853276.79896: variable 'bond_opt' from source: unknown 11762 1726853276.79964: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_user_port_key', 'value': '1023'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_user_port_key", "value": "1023" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key" ], "delta": "0:00:00.003199", "end": "2024-09-20 13:27:56.745486", "rc": 0, "start": "2024-09-20 13:27:56.742287" } STDOUT: 1023 11762 1726853276.80276: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853276.80279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.80282: variable 'omit' from source: magic vars 11762 
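[Editorial note] The pattern repeated throughout this log — running `cat /sys/class/net/nm-bond/bonding/<option>` on the managed node, registering the output as `result`, and evaluating the conditional `(bond_opt.value in result.stdout)` for each loop item — corresponds to a looped `command` task. The playbook itself is not shown in this log; the sketch below is a hypothetical reconstruction. `bond_opt`, `result`, and `controller_device` appear in the log; the variable name `bond_options_to_assert` is an assumption:

```yaml
# Hypothetical reconstruction of the verification task driving this log.
# bond_opt (loop_var), result, and controller_device are visible in the
# log output; bond_options_to_assert is an assumed variable name.
- name: Verify bond option values under /sys/class/net
  command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  # Matches "Evaluated conditional (bond_opt.value in result.stdout)" in the log
  until: bond_opt.value in result.stdout
  loop: "{{ bond_options_to_assert | dict2items }}"
  loop_control:
    loop_var: bond_opt
```

Each iteration produces one `ok: [managed_node2] => (item={'key': …, 'value': …})` result block like those above, with `attempts: 1` indicating the conditional passed on the first try.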
1726853276.80436: variable 'ansible_distribution_major_version' from source: facts 11762 1726853276.80450: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853276.80457: variable 'omit' from source: magic vars 11762 1726853276.80474: variable 'omit' from source: magic vars 11762 1726853276.81017: variable 'controller_device' from source: play vars 11762 1726853276.81021: variable 'bond_opt' from source: unknown 11762 1726853276.81041: variable 'omit' from source: magic vars 11762 1726853276.81063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853276.81072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853276.81084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853276.81105: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853276.81108: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853276.81112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.81431: Set connection var ansible_timeout to 10 11762 1726853276.81434: Set connection var ansible_shell_type to sh 11762 1726853276.81436: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853276.81438: Set connection var ansible_shell_executable to /bin/sh 11762 1726853276.81440: Set connection var ansible_pipelining to False 11762 1726853276.81445: Set connection var ansible_connection to ssh 11762 1726853276.81447: variable 'ansible_shell_executable' from source: unknown 11762 1726853276.81449: variable 'ansible_connection' from source: unknown 11762 1726853276.81451: variable 'ansible_module_compression' from source: unknown 11762 1726853276.81453: 
variable 'ansible_shell_type' from source: unknown 11762 1726853276.81455: variable 'ansible_shell_executable' from source: unknown 11762 1726853276.81456: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853276.81458: variable 'ansible_pipelining' from source: unknown 11762 1726853276.81460: variable 'ansible_timeout' from source: unknown 11762 1726853276.81462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853276.81601: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853276.81604: variable 'omit' from source: magic vars 11762 1726853276.81607: starting attempt loop 11762 1726853276.81609: running the handler 11762 1726853276.81610: _low_level_execute_command(): starting 11762 1726853276.81612: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853276.82377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.82381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.82384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.82386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.82389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853276.82578: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.82582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.82584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.84254: stdout chunk (state=3): >>>/root <<< 11762 1726853276.84497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.84501: stdout chunk (state=3): >>><<< 11762 1726853276.84512: stderr chunk (state=3): >>><<< 11762 1726853276.84604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.84610: _low_level_execute_command(): starting 11762 1726853276.84616: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271 `" && echo ansible-tmp-1726853276.8460062-12895-77703038749271="` echo /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271 `" ) && sleep 0' 11762 1726853276.85959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.86031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.88075: stdout chunk (state=3): 
>>>ansible-tmp-1726853276.8460062-12895-77703038749271=/root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271 <<< 11762 1726853276.88222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.88225: stdout chunk (state=3): >>><<< 11762 1726853276.88227: stderr chunk (state=3): >>><<< 11762 1726853276.88240: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853276.8460062-12895-77703038749271=/root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.88386: variable 'ansible_module_compression' from source: unknown 11762 1726853276.88390: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853276.88392: variable 'ansible_facts' 
from source: unknown 11762 1726853276.88411: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/AnsiballZ_command.py 11762 1726853276.88523: Sending initial data 11762 1726853276.88586: Sent initial data (155 bytes) 11762 1726853276.89247: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.89282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.89377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.89426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.89828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.91302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853276.91398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853276.91488: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp7cpplel1 /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/AnsiballZ_command.py <<< 11762 1726853276.91497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/AnsiballZ_command.py" <<< 11762 1726853276.91584: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp7cpplel1" to remote "/root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/AnsiballZ_command.py" <<< 11762 1726853276.92487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.92568: stderr chunk (state=3): >>><<< 11762 1726853276.92580: stdout chunk (state=3): >>><<< 11762 1726853276.92653: done transferring module to remote 11762 1726853276.92665: _low_level_execute_command(): starting 11762 1726853276.92675: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/ /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/AnsiballZ_command.py && 
sleep 0' 11762 1726853276.93309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.93387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.93432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.93460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.93489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.93613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853276.95684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853276.95688: stdout chunk (state=3): >>><<< 11762 1726853276.95690: stderr chunk (state=3): >>><<< 11762 1726853276.95693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853276.95696: _low_level_execute_command(): starting 11762 1726853276.95698: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/AnsiballZ_command.py && sleep 0' 11762 1726853276.96221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853276.96229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.96238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.96253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.96266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853276.96275: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853276.96284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.96297: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853276.96305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853276.96312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853276.96319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853276.96344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853276.96349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853276.96351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853276.96354: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853276.96454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853276.96457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853276.96460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853276.96462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853276.96567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.12711: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 13:27:57.120323", "end": "2024-09-20 13:27:57.123607", "delta": "0:00:00.003284", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": 
null}}} <<< 11762 1726853277.14292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853277.14324: stderr chunk (state=3): >>><<< 11762 1726853277.14333: stdout chunk (state=3): >>><<< 11762 1726853277.14390: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 13:27:57.120323", "end": "2024-09-20 13:27:57.123607", "delta": "0:00:00.003284", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853277.14536: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/all_slaves_active', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853277.14539: _low_level_execute_command(): starting 11762 1726853277.14541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853276.8460062-12895-77703038749271/ > /dev/null 2>&1 && sleep 0' 11762 1726853277.15755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.15784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853277.15802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853277.16069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853277.16089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.16126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.16281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.18196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.18391: stderr chunk (state=3): >>><<< 11762 1726853277.18395: stdout chunk (state=3): >>><<< 11762 1726853277.18412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.18423: handler run complete 11762 1726853277.18454: Evaluated conditional (False): False 11762 1726853277.18896: variable 'bond_opt' from source: unknown 11762 1726853277.18900: variable 'result' from source: unknown 11762 1726853277.18902: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853277.18904: attempt loop complete, returning result 11762 1726853277.18935: variable 'bond_opt' from source: unknown 11762 1726853277.19138: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'all_slaves_active', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "all_slaves_active", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/all_slaves_active" ], "delta": "0:00:00.003284", "end": "2024-09-20 13:27:57.123607", "rc": 0, "start": "2024-09-20 13:27:57.120323" } STDOUT: 1 11762 1726853277.19657: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.19660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.19662: variable 'omit' from source: magic vars 11762 1726853277.19989: variable 'ansible_distribution_major_version' from source: facts 11762 1726853277.20000: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853277.20008: variable 'omit' from source: magic vars 11762 1726853277.20028: variable 'omit' from source: magic vars 11762 1726853277.20427: variable 'controller_device' from source: play vars 11762 1726853277.20431: variable 'bond_opt' from source: unknown 11762 1726853277.20476: variable 'omit' from source: magic vars 11762 1726853277.20480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 11762 1726853277.20489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853277.20496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853277.20512: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853277.20541: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.20547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.20647: Set connection var ansible_timeout to 10 11762 1726853277.20654: Set connection var ansible_shell_type to sh 11762 1726853277.20656: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853277.20659: Set connection var ansible_shell_executable to /bin/sh 11762 1726853277.20661: Set connection var ansible_pipelining to False 11762 1726853277.20664: Set connection var ansible_connection to ssh 11762 1726853277.20666: variable 'ansible_shell_executable' from source: unknown 11762 1726853277.20668: variable 'ansible_connection' from source: unknown 11762 1726853277.20670: variable 'ansible_module_compression' from source: unknown 11762 1726853277.20675: variable 'ansible_shell_type' from source: unknown 11762 1726853277.20677: variable 'ansible_shell_executable' from source: unknown 11762 1726853277.20679: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.20681: variable 'ansible_pipelining' from source: unknown 11762 1726853277.20683: variable 'ansible_timeout' from source: unknown 11762 1726853277.20685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.20780: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853277.20783: variable 'omit' from source: magic vars 11762 1726853277.20790: starting attempt loop 11762 1726853277.20793: running the handler 11762 1726853277.20795: _low_level_execute_command(): starting 11762 1726853277.20797: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853277.21414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.21422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853277.21433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853277.21465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853277.21476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853277.21551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.21558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.21663: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.23532: stdout chunk (state=3): >>>/root <<< 11762 1726853277.23767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.23773: stdout chunk (state=3): >>><<< 11762 1726853277.23776: stderr chunk (state=3): >>><<< 11762 1726853277.23778: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.23780: _low_level_execute_command(): starting 11762 1726853277.23782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478 `" && echo ansible-tmp-1726853277.236882-12895-3525484715478="` echo 
/root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478 `" ) && sleep 0' 11762 1726853277.24390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.24405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853277.24489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.24504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.24623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.26985: stdout chunk (state=3): >>>ansible-tmp-1726853277.236882-12895-3525484715478=/root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478 <<< 11762 1726853277.27004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.27018: stdout chunk (state=3): >>><<< 11762 1726853277.27027: stderr chunk (state=3): >>><<< 11762 1726853277.27049: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853277.236882-12895-3525484715478=/root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.27129: variable 'ansible_module_compression' from source: unknown 11762 1726853277.27150: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853277.27176: variable 'ansible_facts' from source: unknown 11762 1726853277.27252: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/AnsiballZ_command.py 11762 1726853277.27441: Sending initial data 11762 1726853277.27444: Sent initial data (153 bytes) 11762 1726853277.28105: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853277.28170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.28216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.28400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.30395: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" 
revision 1 <<< 11762 1726853277.30400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853277.30505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpppnbibix /root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/AnsiballZ_command.py <<< 11762 1726853277.30511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/AnsiballZ_command.py" <<< 11762 1726853277.30557: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpppnbibix" to remote "/root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/AnsiballZ_command.py" <<< 11762 1726853277.31595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.31677: stderr chunk (state=3): >>><<< 11762 1726853277.31681: stdout chunk (state=3): >>><<< 11762 1726853277.31683: done transferring module to remote 11762 1726853277.31686: _low_level_execute_command(): starting 11762 1726853277.31695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/ /root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/AnsiballZ_command.py && sleep 0' 11762 1726853277.32359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853277.32466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.32516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.32588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.34581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.34586: stdout chunk (state=3): >>><<< 11762 1726853277.34588: stderr chunk (state=3): >>><<< 11762 1726853277.34608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.34611: _low_level_execute_command(): starting 11762 1726853277.34614: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/AnsiballZ_command.py && sleep 0' 11762 1726853277.35288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.35396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.35416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.35533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.51915: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 13:27:57.512046", "end": "2024-09-20 13:27:57.515377", "delta": "0:00:00.003331", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853277.53576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853277.53581: stdout chunk (state=3): >>><<< 11762 1726853277.53583: stderr chunk (state=3): >>><<< 11762 1726853277.53586: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 13:27:57.512046", "end": "2024-09-20 13:27:57.515377", "delta": "0:00:00.003331", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853277.53593: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/downdelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853277.53595: _low_level_execute_command(): starting 11762 1726853277.53598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853277.236882-12895-3525484715478/ > /dev/null 2>&1 && sleep 0' 11762 1726853277.54145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.54164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853277.54279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.54300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.54401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.56779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.56785: stdout chunk (state=3): >>><<< 11762 1726853277.56788: stderr chunk (state=3): >>><<< 11762 1726853277.56793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.56796: handler run complete 11762 1726853277.56798: Evaluated conditional (False): False 11762 1726853277.56799: variable 'bond_opt' from source: unknown 11762 1726853277.56801: variable 'result' from source: unknown 11762 1726853277.56803: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853277.56805: attempt loop complete, returning result 11762 1726853277.56806: variable 'bond_opt' from source: unknown 11762 1726853277.56808: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'downdelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "downdelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/downdelay" ], "delta": "0:00:00.003331", "end": "2024-09-20 13:27:57.515377", "rc": 0, "start": "2024-09-20 13:27:57.512046" } STDOUT: 0 11762 1726853277.57017: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.57033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.57048: variable 'omit' from source: magic vars 11762 1726853277.57234: variable 'ansible_distribution_major_version' from source: facts 11762 1726853277.57278: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853277.57281: variable 'omit' from source: magic vars 11762 1726853277.57284: variable 'omit' from source: magic vars 11762 
1726853277.57505: variable 'controller_device' from source: play vars 11762 1726853277.57515: variable 'bond_opt' from source: unknown 11762 1726853277.57540: variable 'omit' from source: magic vars 11762 1726853277.57605: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853277.57608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853277.57611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853277.57613: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853277.57617: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.57625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.57717: Set connection var ansible_timeout to 10 11762 1726853277.57726: Set connection var ansible_shell_type to sh 11762 1726853277.57737: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853277.57828: Set connection var ansible_shell_executable to /bin/sh 11762 1726853277.57832: Set connection var ansible_pipelining to False 11762 1726853277.57835: Set connection var ansible_connection to ssh 11762 1726853277.57837: variable 'ansible_shell_executable' from source: unknown 11762 1726853277.57839: variable 'ansible_connection' from source: unknown 11762 1726853277.57841: variable 'ansible_module_compression' from source: unknown 11762 1726853277.57843: variable 'ansible_shell_type' from source: unknown 11762 1726853277.57844: variable 'ansible_shell_executable' from source: unknown 11762 1726853277.57846: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.57848: variable 'ansible_pipelining' from source: unknown 11762 
1726853277.57850: variable 'ansible_timeout' from source: unknown 11762 1726853277.57852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.57989: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853277.58002: variable 'omit' from source: magic vars 11762 1726853277.58010: starting attempt loop 11762 1726853277.58016: running the handler 11762 1726853277.58040: _low_level_execute_command(): starting 11762 1726853277.58077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853277.58790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853277.58827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853277.58845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 
1726853277.58873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.58983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.60917: stdout chunk (state=3): >>>/root <<< 11762 1726853277.60937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.61039: stdout chunk (state=3): >>><<< 11762 1726853277.61043: stderr chunk (state=3): >>><<< 11762 1726853277.61045: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.61048: _low_level_execute_command(): starting 11762 1726853277.61050: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032 `" && echo ansible-tmp-1726853277.6100667-12895-32558814239032="` echo /root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032 `" ) && sleep 0' 11762 1726853277.62707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.62792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.62901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.64983: stdout chunk (state=3): >>>ansible-tmp-1726853277.6100667-12895-32558814239032=/root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032 <<< 11762 1726853277.65155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.65243: stderr chunk (state=3): >>><<< 11762 1726853277.65275: stdout chunk (state=3): >>><<< 11762 1726853277.65329: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853277.6100667-12895-32558814239032=/root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.65580: variable 'ansible_module_compression' from source: unknown 11762 1726853277.65583: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853277.65586: variable 'ansible_facts' from source: unknown 11762 1726853277.65588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/AnsiballZ_command.py 11762 1726853277.65989: Sending initial data 11762 1726853277.65992: Sent initial data (155 bytes) 11762 1726853277.67463: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853277.67541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853277.67562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.67585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.67692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.69413: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853277.69502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853277.69791: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp49d1dfut /root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/AnsiballZ_command.py <<< 11762 1726853277.69795: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/AnsiballZ_command.py" <<< 11762 1726853277.69797: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp49d1dfut" to remote "/root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/AnsiballZ_command.py" <<< 11762 1726853277.71043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.71106: stderr chunk (state=3): >>><<< 11762 1726853277.71110: stdout chunk (state=3): >>><<< 11762 1726853277.71130: done transferring module to remote 11762 1726853277.71139: _low_level_execute_command(): starting 11762 1726853277.71143: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/ /root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/AnsiballZ_command.py && sleep 0' 11762 1726853277.71750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.71760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853277.71774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853277.71789: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853277.71882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853277.71924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.71937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.72005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.74004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.74008: stdout chunk (state=3): >>><<< 11762 1726853277.74017: stderr chunk (state=3): >>><<< 11762 1726853277.74112: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853277.74115: _low_level_execute_command(): starting 11762 1726853277.74118: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/AnsiballZ_command.py && sleep 0' 11762 1726853277.74952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.74995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853277.75003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853277.75010: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853277.75086: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853277.75102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.75116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.75228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.91609: stdout chunk (state=3): >>> {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 13:27:57.911654", "end": "2024-09-20 13:27:57.914992", "delta": "0:00:00.003338", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853277.93365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.93404: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853277.93408: stdout chunk (state=3): >>><<< 11762 1726853277.93410: stderr chunk (state=3): >>><<< 11762 1726853277.93429: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 13:27:57.911654", "end": "2024-09-20 13:27:57.914992", "delta": "0:00:00.003338", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853277.93467: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lacp_rate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853277.93481: _low_level_execute_command(): starting 11762 1726853277.93513: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853277.6100667-12895-32558814239032/ > /dev/null 2>&1 && sleep 0' 11762 1726853277.94158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.94182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853277.94250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853277.94305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853277.94323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.94385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.94579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853277.96507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853277.96511: stdout chunk (state=3): >>><<< 11762 1726853277.96517: stderr chunk (state=3): >>><<< 11762 1726853277.96534: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 11762 1726853277.96537: handler run complete 11762 1726853277.96677: Evaluated conditional (False): False 11762 1726853277.96725: variable 'bond_opt' from source: unknown 11762 1726853277.96731: variable 'result' from source: unknown 11762 1726853277.96745: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853277.96753: attempt loop complete, returning result 11762 1726853277.96769: variable 'bond_opt' from source: unknown 11762 1726853277.96838: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'lacp_rate', 'value': 'slow'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lacp_rate", "value": "slow" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lacp_rate" ], "delta": "0:00:00.003338", "end": "2024-09-20 13:27:57.914992", "rc": 0, "start": "2024-09-20 13:27:57.911654" } STDOUT: slow 0 11762 1726853277.96979: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.96982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.96985: variable 'omit' from source: magic vars 11762 1726853277.97176: variable 'ansible_distribution_major_version' from source: facts 11762 1726853277.97179: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853277.97181: variable 'omit' from source: magic vars 11762 1726853277.97184: variable 'omit' from source: magic vars 11762 1726853277.97292: variable 'controller_device' from source: play vars 11762 1726853277.97295: variable 'bond_opt' from source: unknown 11762 1726853277.97375: variable 'omit' from source: magic vars 11762 1726853277.97378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853277.97381: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853277.97387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853277.97390: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853277.97392: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.97394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.97437: Set connection var ansible_timeout to 10 11762 1726853277.97447: Set connection var ansible_shell_type to sh 11762 1726853277.97454: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853277.97459: Set connection var ansible_shell_executable to /bin/sh 11762 1726853277.97467: Set connection var ansible_pipelining to False 11762 1726853277.97474: Set connection var ansible_connection to ssh 11762 1726853277.97491: variable 'ansible_shell_executable' from source: unknown 11762 1726853277.97494: variable 'ansible_connection' from source: unknown 11762 1726853277.97497: variable 'ansible_module_compression' from source: unknown 11762 1726853277.97499: variable 'ansible_shell_type' from source: unknown 11762 1726853277.97501: variable 'ansible_shell_executable' from source: unknown 11762 1726853277.97503: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853277.97676: variable 'ansible_pipelining' from source: unknown 11762 1726853277.97679: variable 'ansible_timeout' from source: unknown 11762 1726853277.97681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853277.97684: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853277.97686: variable 'omit' from source: magic vars 11762 1726853277.97688: starting attempt loop 11762 1726853277.97690: running the handler 11762 1726853277.97691: _low_level_execute_command(): starting 11762 1726853277.97693: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853277.98269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853277.98282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853277.98316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853277.98323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853277.98330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853277.98385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853277.98431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853277.98450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853277.98463: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853277.98575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.00314: stdout chunk (state=3): >>>/root <<< 11762 1726853278.00523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.00527: stdout chunk (state=3): >>><<< 11762 1726853278.00529: stderr chunk (state=3): >>><<< 11762 1726853278.00634: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.00640: _low_level_execute_command(): starting 11762 1726853278.00643: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901 `" && echo 
ansible-tmp-1726853278.0055287-12895-61458253607901="` echo /root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901 `" ) && sleep 0' 11762 1726853278.01867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853278.01879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853278.01889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853278.01902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853278.02192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.02198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.02207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.02322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.04477: stdout chunk (state=3): >>>ansible-tmp-1726853278.0055287-12895-61458253607901=/root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901 <<< 11762 1726853278.04559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 
1726853278.04627: stderr chunk (state=3): >>><<< 11762 1726853278.04668: stdout chunk (state=3): >>><<< 11762 1726853278.04690: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853278.0055287-12895-61458253607901=/root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.04714: variable 'ansible_module_compression' from source: unknown 11762 1726853278.04753: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853278.04976: variable 'ansible_facts' from source: unknown 11762 1726853278.04993: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/AnsiballZ_command.py 11762 1726853278.05598: Sending initial data 11762 1726853278.05601: 
Sent initial data (155 bytes) 11762 1726853278.06392: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853278.06398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853278.06414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853278.06420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.06433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853278.06439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853278.06459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.06522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.06527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.06681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.06783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.08611: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853278.08688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853278.08766: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp3qvikhq8 /root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/AnsiballZ_command.py <<< 11762 1726853278.08769: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/AnsiballZ_command.py" <<< 11762 1726853278.08833: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp3qvikhq8" to remote "/root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/AnsiballZ_command.py" <<< 11762 1726853278.09804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.09836: stderr chunk (state=3): >>><<< 11762 1726853278.09840: stdout chunk (state=3): >>><<< 11762 1726853278.10125: done transferring module to remote 11762 1726853278.10128: _low_level_execute_command(): starting 11762 1726853278.10131: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/ 
/root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/AnsiballZ_command.py && sleep 0' 11762 1726853278.11999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.12116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.12390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.14178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.14202: stdout chunk (state=3): >>><<< 11762 1726853278.14205: stderr chunk (state=3): >>><<< 11762 1726853278.14222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.14230: _low_level_execute_command(): starting 11762 1726853278.14239: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/AnsiballZ_command.py && sleep 0' 11762 1726853278.15592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.15808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.15834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.15946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.32136: stdout chunk (state=3): >>> {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 13:27:58.316992", "end": "2024-09-20 13:27:58.320283", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853278.33831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853278.33842: stdout chunk (state=3): >>><<< 11762 1726853278.33857: stderr chunk (state=3): >>><<< 11762 1726853278.33891: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 13:27:58.316992", "end": "2024-09-20 13:27:58.320283", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
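The conditionals logged above (`Evaluated conditional (bond_opt.value in result.stdout): True`) are plain substring tests, which is why an expected value of `slow` matches a stdout of `slow 0`: the bonding sysfs files print the symbolic name and its numeric code together. A minimal sketch of that check semantics, using values copied from the log (the dict names are illustrative, not from the playbook):

```python
# Bonding options in sysfs print "<name> <code>", e.g. the file
# /sys/class/net/nm-bond/bonding/lacp_rate contains "slow 0".
expected = {"lacp_rate": "slow", "lp_interval": "128"}

# stdout values captured in the module results above
observed = {"lacp_rate": "slow 0", "lp_interval": "128"}

# Mirrors Ansible's `bond_opt.value in result.stdout`:
# a substring test, not an exact string comparison.
results = {key: expected[key] in observed[key] for key in expected}
print(results)  # both checks pass
```

Note that a substring test like this would also accept a partial match (e.g. expecting `low` against `slow 0`), so an exact-field comparison such as `result.stdout.split()[0] == bond_opt.value` would be stricter if that mattered.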
11762 1726853278.33925: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853278.33937: _low_level_execute_command(): starting 11762 1726853278.33946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853278.0055287-12895-61458253607901/ > /dev/null 2>&1 && sleep 0' 11762 1726853278.35036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.35247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.35357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.35480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.37433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.37437: stdout chunk (state=3): >>><<< 11762 1726853278.37449: stderr chunk (state=3): >>><<< 11762 1726853278.37479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.37483: handler run complete 11762 1726853278.37504: Evaluated conditional (False): False 11762 
1726853278.37679: variable 'bond_opt' from source: unknown 11762 1726853278.37691: variable 'result' from source: unknown 11762 1726853278.37700: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853278.37718: attempt loop complete, returning result 11762 1726853278.37738: variable 'bond_opt' from source: unknown 11762 1726853278.37825: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'lp_interval', 'value': '128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lp_interval", "value": "128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lp_interval" ], "delta": "0:00:00.003291", "end": "2024-09-20 13:27:58.320283", "rc": 0, "start": "2024-09-20 13:27:58.316992" } STDOUT: 128 11762 1726853278.38322: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853278.38325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853278.38328: variable 'omit' from source: magic vars 11762 1726853278.38334: variable 'ansible_distribution_major_version' from source: facts 11762 1726853278.38337: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853278.38338: variable 'omit' from source: magic vars 11762 1726853278.38341: variable 'omit' from source: magic vars 11762 1726853278.38603: variable 'controller_device' from source: play vars 11762 1726853278.38607: variable 'bond_opt' from source: unknown 11762 1726853278.38609: variable 'omit' from source: magic vars 11762 1726853278.38611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853278.38613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853278.38615: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853278.38617: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853278.38620: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853278.38623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853278.38714: Set connection var ansible_timeout to 10 11762 1726853278.38807: Set connection var ansible_shell_type to sh 11762 1726853278.38903: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853278.38906: Set connection var ansible_shell_executable to /bin/sh 11762 1726853278.38909: Set connection var ansible_pipelining to False 11762 1726853278.38911: Set connection var ansible_connection to ssh 11762 1726853278.38913: variable 'ansible_shell_executable' from source: unknown 11762 1726853278.38915: variable 'ansible_connection' from source: unknown 11762 1726853278.38916: variable 'ansible_module_compression' from source: unknown 11762 1726853278.38918: variable 'ansible_shell_type' from source: unknown 11762 1726853278.38920: variable 'ansible_shell_executable' from source: unknown 11762 1726853278.38922: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853278.38924: variable 'ansible_pipelining' from source: unknown 11762 1726853278.38926: variable 'ansible_timeout' from source: unknown 11762 1726853278.38928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853278.39048: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853278.39138: variable 'omit' from source: magic vars 11762 1726853278.39142: starting 
attempt loop 11762 1726853278.39144: running the handler 11762 1726853278.39146: _low_level_execute_command(): starting 11762 1726853278.39148: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853278.39695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853278.39710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853278.39725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853278.39783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.39836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.39861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.39884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.40076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.41818: stdout chunk (state=3): >>>/root <<< 11762 1726853278.41946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.41983: stderr chunk (state=3): >>><<< 11762 
1726853278.41987: stdout chunk (state=3): >>><<< 11762 1726853278.42098: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.42101: _low_level_execute_command(): starting 11762 1726853278.42104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609 `" && echo ansible-tmp-1726853278.4201403-12895-41677635576609="` echo /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609 `" ) && sleep 0' 11762 1726853278.43092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.43304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.43408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.45444: stdout chunk (state=3): >>>ansible-tmp-1726853278.4201403-12895-41677635576609=/root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609 <<< 11762 1726853278.45618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.45621: stdout chunk (state=3): >>><<< 11762 1726853278.45876: stderr chunk (state=3): >>><<< 11762 1726853278.45880: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853278.4201403-12895-41677635576609=/root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.45883: variable 'ansible_module_compression' from source: unknown 11762 1726853278.45885: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853278.45887: variable 'ansible_facts' from source: unknown 11762 1726853278.45889: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/AnsiballZ_command.py 11762 1726853278.45982: Sending initial data 11762 1726853278.45985: Sent initial data (155 bytes) 11762 1726853278.46654: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.46679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.46811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.46964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.48622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853278.48834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853278.48924: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpy4ers2hx /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/AnsiballZ_command.py <<< 11762 1726853278.48928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/AnsiballZ_command.py" <<< 11762 1726853278.49012: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpy4ers2hx" to remote "/root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/AnsiballZ_command.py" <<< 11762 1726853278.50214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.50302: stderr chunk (state=3): >>><<< 11762 1726853278.50305: stdout chunk (state=3): >>><<< 11762 1726853278.50342: done transferring module to remote 11762 1726853278.50353: _low_level_execute_command(): starting 11762 1726853278.50358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/ /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/AnsiballZ_command.py && sleep 0' 11762 1726853278.51094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.51110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.51122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.51141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.51238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.53315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.53352: stderr chunk (state=3): >>><<< 11762 1726853278.53368: stdout chunk (state=3): >>><<< 11762 1726853278.53393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.53480: _low_level_execute_command(): starting 11762 1726853278.53484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/AnsiballZ_command.py && sleep 0' 11762 1726853278.54086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.54105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.54122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.54145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.54261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.70669: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 13:27:58.701463", "end": "2024-09-20 13:27:58.704760", "delta": "0:00:00.003297", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853278.72386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853278.72390: stdout chunk (state=3): >>><<< 11762 1726853278.72393: stderr chunk (state=3): >>><<< 11762 1726853278.72395: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 13:27:58.701463", "end": "2024-09-20 13:27:58.704760", "delta": "0:00:00.003297", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853278.72417: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/miimon', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853278.72429: _low_level_execute_command(): starting 11762 1726853278.72574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853278.4201403-12895-41677635576609/ > /dev/null 2>&1 && sleep 0' 11762 1726853278.73675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853278.73681: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.73684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853278.73686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.73861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.73865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.74030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.74087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.76035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.76118: stderr chunk (state=3): >>><<< 11762 1726853278.76122: stdout chunk (state=3): >>><<< 11762 1726853278.76139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.76480: handler run complete 11762 1726853278.76483: Evaluated conditional (False): False 11762 1726853278.76578: variable 'bond_opt' from source: unknown 11762 1726853278.76631: variable 'result' from source: unknown 11762 1726853278.76679: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853278.76720: attempt loop complete, returning result 11762 1726853278.76879: variable 'bond_opt' from source: unknown 11762 1726853278.76908: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'miimon', 'value': '110'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "miimon", "value": "110" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/miimon" ], "delta": "0:00:00.003297", "end": "2024-09-20 13:27:58.704760", "rc": 0, "start": "2024-09-20 13:27:58.701463" } STDOUT: 110 11762 1726853278.77251: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853278.77254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853278.77299: variable 'omit' from source: magic vars 11762 1726853278.77458: variable 'ansible_distribution_major_version' from source: facts 11762 1726853278.77477: Evaluated 
conditional (ansible_distribution_major_version != '6'): True 11762 1726853278.77487: variable 'omit' from source: magic vars 11762 1726853278.77505: variable 'omit' from source: magic vars 11762 1726853278.77687: variable 'controller_device' from source: play vars 11762 1726853278.77698: variable 'bond_opt' from source: unknown 11762 1726853278.77752: variable 'omit' from source: magic vars 11762 1726853278.77756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853278.77764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853278.77777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853278.77799: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853278.77807: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853278.77814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853278.77901: Set connection var ansible_timeout to 10 11762 1726853278.77970: Set connection var ansible_shell_type to sh 11762 1726853278.77974: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853278.77976: Set connection var ansible_shell_executable to /bin/sh 11762 1726853278.77978: Set connection var ansible_pipelining to False 11762 1726853278.77980: Set connection var ansible_connection to ssh 11762 1726853278.77982: variable 'ansible_shell_executable' from source: unknown 11762 1726853278.77984: variable 'ansible_connection' from source: unknown 11762 1726853278.77985: variable 'ansible_module_compression' from source: unknown 11762 1726853278.77987: variable 'ansible_shell_type' from source: unknown 11762 1726853278.77989: variable 'ansible_shell_executable' from 
source: unknown 11762 1726853278.77995: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853278.78003: variable 'ansible_pipelining' from source: unknown 11762 1726853278.78015: variable 'ansible_timeout' from source: unknown 11762 1726853278.78022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853278.78118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853278.78134: variable 'omit' from source: magic vars 11762 1726853278.78141: starting attempt loop 11762 1726853278.78151: running the handler 11762 1726853278.78187: _low_level_execute_command(): starting 11762 1726853278.78190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853278.78960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853278.79006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853278.79019: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.79103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.79125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.79199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.81194: stdout chunk (state=3): >>>/root <<< 11762 1726853278.81197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.81199: stderr chunk (state=3): >>><<< 11762 1726853278.81201: stdout chunk (state=3): >>><<< 11762 1726853278.81347: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 
1726853278.81375: _low_level_execute_command(): starting 11762 1726853278.81383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240 `" && echo ansible-tmp-1726853278.812649-12895-242630404221240="` echo /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240 `" ) && sleep 0' 11762 1726853278.82155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853278.82254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.82264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.82288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.82300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.82404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.84642: stdout chunk (state=3): 
>>>ansible-tmp-1726853278.812649-12895-242630404221240=/root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240 <<< 11762 1726853278.84719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.84723: stdout chunk (state=3): >>><<< 11762 1726853278.84725: stderr chunk (state=3): >>><<< 11762 1726853278.84877: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853278.812649-12895-242630404221240=/root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.84881: variable 'ansible_module_compression' from source: unknown 11762 1726853278.84883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853278.84886: variable 'ansible_facts' 
from source: unknown 11762 1726853278.85026: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/AnsiballZ_command.py 11762 1726853278.85289: Sending initial data 11762 1726853278.85349: Sent initial data (155 bytes) 11762 1726853278.86073: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853278.86091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853278.86175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.86275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.86342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.88129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853278.88199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853278.88262: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpzqr8d_ni /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/AnsiballZ_command.py <<< 11762 1726853278.88289: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/AnsiballZ_command.py" <<< 11762 1726853278.88355: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpzqr8d_ni" to remote "/root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/AnsiballZ_command.py" <<< 11762 1726853278.89611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.89615: stdout chunk (state=3): >>><<< 11762 1726853278.89618: stderr chunk (state=3): >>><<< 11762 1726853278.89692: done transferring module to remote 11762 1726853278.89698: _low_level_execute_command(): starting 11762 1726853278.89703: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/ /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/AnsiballZ_command.py && 
sleep 0' 11762 1726853278.90519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853278.90581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853278.90584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853278.90587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853278.90589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853278.90591: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853278.90593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.90602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853278.90607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853278.90614: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853278.90653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853278.90704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853278.90746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.90749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.90812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853278.92945: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853278.92949: stdout chunk (state=3): >>><<< 11762 1726853278.92951: stderr chunk (state=3): >>><<< 11762 1726853278.92954: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853278.92956: _low_level_execute_command(): starting 11762 1726853278.92958: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/AnsiballZ_command.py && sleep 0' 11762 1726853278.93998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853278.94006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853278.94016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
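The exchange just completed is Ansible's standard module execution path: create a remote temp directory, sftp the `AnsiballZ_command.py` wrapper into it, `chmod u+x` it, then run it with the remote Python. The temp directory name seen in the log (`ansible-tmp-1726853278.812649-12895-242630404221240`) embeds a timestamp, a PID, and a random suffix. A minimal sketch of that naming convention — illustrative only, not Ansible's exact implementation, which lives in the shell plugin and also honors `remote_tmp` and umask settings:

```python
import os
import random
import time


def make_remote_tmp_name(base="/root/.ansible/tmp"):
    """Sketch of the ansible-tmp-<timestamp>-<pid>-<random> naming style.

    Hypothetical helper for illustration; the real logic is in
    Ansible's shell plugin, not this function.
    """
    ts = time.time()                    # e.g. 1726853278.812649
    suffix = random.randint(0, 2**48)   # random component to avoid collisions
    return f"{base}/ansible-tmp-{ts}-{os.getpid()}-{suffix}"


print(make_remote_tmp_name())
```

The random suffix matters because several forks of the same controller process can be staging modules on the same host at once; timestamp plus PID alone would not be unique.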
11762 1726853278.94036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853278.94137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853278.94140: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853278.94163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853278.94273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.10261: stdout chunk (state=3): >>> {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-20 13:27:59.098338", "end": "2024-09-20 13:27:59.101602", "delta": "0:00:00.003264", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853279.12079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853279.12084: stderr chunk (state=3): >>><<< 11762 1726853279.12086: stdout chunk (state=3): >>><<< 11762 1726853279.12088: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-20 13:27:59.098338", "end": "2024-09-20 13:27:59.101602", "delta": "0:00:00.003264", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
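The module's entire result comes back as a single JSON document on stdout, which the action plugin parses before applying conditionals. A sketch of that parsing step, using the payload from the log above (abridged — the `invocation` block is omitted):

```python
import json

# stdout captured from AnsiballZ_command.py in the log above (abridged).
raw = (
    '{"changed": true, "stdout": "64", "stderr": "", "rc": 0, '
    '"cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], '
    '"start": "2024-09-20 13:27:59.098338", "end": "2024-09-20 13:27:59.101602", '
    '"delta": "0:00:00.003264", "msg": ""}'
)

result = json.loads(raw)
assert result["rc"] == 0     # the remote `cat` exited cleanly
print(result["stdout"])      # prints 64, the bond's num_grat_arp value
```

This is why the SSH stderr noise above is harmless: only stdout is parsed as the module result, while stderr is carried along separately for diagnostics.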
11762 1726853279.12091: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/num_grat_arp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853279.12093: _low_level_execute_command(): starting 11762 1726853279.12095: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853278.812649-12895-242630404221240/ > /dev/null 2>&1 && sleep 0' 11762 1726853279.12606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.12623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.12635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.12652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.12665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.12675: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.12684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.12698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853279.12706: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853279.12736: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.12801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853279.12828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.12925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.14903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.14907: stdout chunk (state=3): >>><<< 11762 1726853279.14912: stderr chunk (state=3): >>><<< 11762 1726853279.14943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.14950: handler run complete 11762 1726853279.14974: Evaluated conditional (False): False 11762 1726853279.15143: variable 'bond_opt' from source: unknown 11762 1726853279.15146: variable 'result' from source: unknown 11762 1726853279.15162: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853279.15174: attempt loop complete, returning result 11762 1726853279.15202: variable 'bond_opt' from source: unknown 11762 1726853279.15268: variable 'bond_opt' from source: unknown
ok: [managed_node2] => (item={'key': 'num_grat_arp', 'value': '64'}) => {
    "ansible_loop_var": "bond_opt",
    "attempts": 1,
    "bond_opt": {
        "key": "num_grat_arp",
        "value": "64"
    },
    "changed": false,
    "cmd": [
        "cat",
        "/sys/class/net/nm-bond/bonding/num_grat_arp"
    ],
    "delta": "0:00:00.003264",
    "end": "2024-09-20 13:27:59.101602",
    "rc": 0,
    "start": "2024-09-20 13:27:59.098338"
}

STDOUT:

64

11762 1726853279.15411: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.15415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.15418: variable 'omit' from source: magic vars 11762 1726853279.15676: variable 'ansible_distribution_major_version' from source: facts 11762 1726853279.15679: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853279.15681: variable 'omit' from source: magic vars 11762 1726853279.15684: variable 'omit' from source: magic vars 11762 1726853279.15876: variable 'controller_device' from source: play vars 11762 1726853279.15879: variable 'bond_opt' from source: unknown 11762 
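The line `Evaluated conditional (bond_opt.value in result.stdout): True` is the actual assertion of this test step: the configured bond option value must appear in the sysfs output. Jinja2's `in` on strings is a plain substring test, so the check reduces to the following sketch (values taken from the log; variable names are illustrative):

```python
# Loop item and command output as shown in the task result above.
bond_opt = {"key": "num_grat_arp", "value": "64"}
result_stdout = "64"  # contents of /sys/class/net/nm-bond/bonding/num_grat_arp

# Jinja2's `bond_opt.value in result.stdout` is a substring membership test.
ok = bond_opt["value"] in result_stdout
print(ok)  # prints True
```

Note that a substring test is a loose check: it would also pass if the file contained, say, `640`. An exact comparison (`result.stdout.strip() == bond_opt.value`) would be stricter.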
1726853279.15881: variable 'omit' from source: magic vars 11762 1726853279.15884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853279.15887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853279.15890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853279.15893: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853279.15896: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.15898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.15966: Set connection var ansible_timeout to 10 11762 1726853279.15970: Set connection var ansible_shell_type to sh 11762 1726853279.15977: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853279.15983: Set connection var ansible_shell_executable to /bin/sh 11762 1726853279.15990: Set connection var ansible_pipelining to False 11762 1726853279.15997: Set connection var ansible_connection to ssh 11762 1726853279.16023: variable 'ansible_shell_executable' from source: unknown 11762 1726853279.16027: variable 'ansible_connection' from source: unknown 11762 1726853279.16029: variable 'ansible_module_compression' from source: unknown 11762 1726853279.16031: variable 'ansible_shell_type' from source: unknown 11762 1726853279.16033: variable 'ansible_shell_executable' from source: unknown 11762 1726853279.16036: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.16040: variable 'ansible_pipelining' from source: unknown 11762 1726853279.16043: variable 'ansible_timeout' from source: unknown 11762 1726853279.16057: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11762 1726853279.16164: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853279.16174: variable 'omit' from source: magic vars 11762 1726853279.16179: starting attempt loop 11762 1726853279.16182: running the handler 11762 1726853279.16189: _low_level_execute_command(): starting 11762 1726853279.16193: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853279.16820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.16830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.16840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.16855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.16866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.16874: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.16884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.16979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853279.16982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853279.16984: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853279.16986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.16988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.16994: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.16996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.16998: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853279.17000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.17040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.17052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.17160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.19229: stdout chunk (state=3): >>>/root <<< 11762 1726853279.19246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.19253: stdout chunk (state=3): >>><<< 11762 1726853279.19262: stderr chunk (state=3): >>><<< 11762 1726853279.19282: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.19291: _low_level_execute_command(): starting 11762 1726853279.19336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065 `" && echo ansible-tmp-1726853279.1928127-12895-64677021526065="` echo /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065 `" ) && sleep 0' 11762 1726853279.20505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.20564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.20578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.20594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.20612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.20616: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.20649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.20653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853279.20655: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853279.20664: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853279.20722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
11762 1726853279.20726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.20852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.20888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.21088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.23122: stdout chunk (state=3): >>>ansible-tmp-1726853279.1928127-12895-64677021526065=/root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065 <<< 11762 1726853279.23307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.23354: stderr chunk (state=3): >>><<< 11762 1726853279.23364: stdout chunk (state=3): >>><<< 11762 1726853279.23390: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853279.1928127-12895-64677021526065=/root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.23412: variable 'ansible_module_compression' from source: unknown 11762 1726853279.23453: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853279.23470: variable 'ansible_facts' from source: unknown 11762 1726853279.23577: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/AnsiballZ_command.py 11762 1726853279.23719: Sending initial data 11762 1726853279.23722: Sent initial data (155 bytes) 11762 1726853279.24296: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.24305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.24317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.24331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.24374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.24378: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.24566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.24576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.24578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.26258: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853279.26264: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853279.26364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853279.26407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpxqv_s3hn /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/AnsiballZ_command.py <<< 11762 1726853279.26436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/AnsiballZ_command.py" <<< 11762 1726853279.26740: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpxqv_s3hn" to remote "/root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/AnsiballZ_command.py" <<< 11762 1726853279.27810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.27814: stdout chunk (state=3): >>><<< 11762 1726853279.27821: stderr chunk (state=3): >>><<< 11762 1726853279.27845: done transferring module to remote 11762 1726853279.27852: _low_level_execute_command(): starting 11762 1726853279.27855: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/ /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/AnsiballZ_command.py && sleep 0' 11762 1726853279.28478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.28482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.28484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.28487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.28489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 
1726853279.28491: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.28496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.28513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853279.28518: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853279.28520: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853279.28529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.28537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.28548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.28555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.28561: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853279.28580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.28676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853279.28680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.28682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.28761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.30787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.30800: stderr chunk (state=3): >>><<< 11762 1726853279.30941: stdout chunk (state=3): >>><<< 11762 1726853279.30951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.30957: _low_level_execute_command(): starting 11762 1726853279.30960: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/AnsiballZ_command.py && sleep 0' 11762 1726853279.31949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.31953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853279.31955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853279.31957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853279.31959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.32287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.32392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.32624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.48593: stdout chunk (state=3): >>> {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 13:27:59.481586", "end": "2024-09-20 13:27:59.484818", "delta": "0:00:00.003232", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853279.50551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853279.50555: stdout chunk (state=3): >>><<< 11762 1726853279.50558: stderr chunk (state=3): >>><<< 11762 1726853279.50561: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 13:27:59.481586", "end": "2024-09-20 13:27:59.484818", "delta": "0:00:00.003232", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853279.50563: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/resend_igmp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853279.50565: _low_level_execute_command(): starting 11762 1726853279.50568: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853279.1928127-12895-64677021526065/ > /dev/null 2>&1 && sleep 0' 11762 1726853279.51950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.52176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.52291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.52387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.54306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.54487: stderr chunk (state=3): >>><<< 11762 1726853279.54491: stdout chunk (state=3): >>><<< 11762 1726853279.54512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.54515: handler run complete 11762 1726853279.54535: Evaluated conditional (False): False 11762 
1726853279.54731: variable 'bond_opt' from source: unknown 11762 1726853279.54797: variable 'result' from source: unknown 11762 1726853279.54810: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853279.54822: attempt loop complete, returning result 11762 1726853279.54841: variable 'bond_opt' from source: unknown 11762 1726853279.55013: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'resend_igmp', 'value': '225'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "resend_igmp", "value": "225" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/resend_igmp" ], "delta": "0:00:00.003232", "end": "2024-09-20 13:27:59.484818", "rc": 0, "start": "2024-09-20 13:27:59.481586" } STDOUT: 225 11762 1726853279.55378: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.55381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.55384: variable 'omit' from source: magic vars 11762 1726853279.55657: variable 'ansible_distribution_major_version' from source: facts 11762 1726853279.55660: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853279.55662: variable 'omit' from source: magic vars 11762 1726853279.55664: variable 'omit' from source: magic vars 11762 1726853279.56009: variable 'controller_device' from source: play vars 11762 1726853279.56017: variable 'bond_opt' from source: unknown 11762 1726853279.56031: variable 'omit' from source: magic vars 11762 1726853279.56101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853279.56105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853279.56107: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853279.56110: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853279.56112: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.56114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.56208: Set connection var ansible_timeout to 10 11762 1726853279.56211: Set connection var ansible_shell_type to sh 11762 1726853279.56213: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853279.56214: Set connection var ansible_shell_executable to /bin/sh 11762 1726853279.56395: Set connection var ansible_pipelining to False 11762 1726853279.56476: Set connection var ansible_connection to ssh 11762 1726853279.56482: variable 'ansible_shell_executable' from source: unknown 11762 1726853279.56487: variable 'ansible_connection' from source: unknown 11762 1726853279.56490: variable 'ansible_module_compression' from source: unknown 11762 1726853279.56492: variable 'ansible_shell_type' from source: unknown 11762 1726853279.56494: variable 'ansible_shell_executable' from source: unknown 11762 1726853279.56495: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.56497: variable 'ansible_pipelining' from source: unknown 11762 1726853279.56499: variable 'ansible_timeout' from source: unknown 11762 1726853279.56501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.56632: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853279.56641: variable 'omit' from source: magic vars 11762 1726853279.56646: starting 
attempt loop 11762 1726853279.56649: running the handler 11762 1726853279.56653: _low_level_execute_command(): starting 11762 1726853279.56657: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853279.58163: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.58289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.58683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.60134: stdout chunk (state=3): >>>/root <<< 11762 1726853279.60264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.60323: stderr chunk (state=3): >>><<< 11762 1726853279.60536: stdout chunk (state=3): >>><<< 11762 1726853279.60539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.60545: _low_level_execute_command(): starting 11762 1726853279.60547: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281 `" && echo ansible-tmp-1726853279.6045616-12895-33754641666281="` echo /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281 `" ) && sleep 0' 11762 1726853279.61389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.61445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.61449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.61451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.61485: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.61518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853279.61539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.61553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.61803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.63845: stdout chunk (state=3): >>>ansible-tmp-1726853279.6045616-12895-33754641666281=/root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281 <<< 11762 1726853279.63986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.63990: stdout chunk (state=3): >>><<< 11762 1726853279.63996: stderr chunk (state=3): >>><<< 11762 1726853279.64015: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853279.6045616-12895-33754641666281=/root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.64059: variable 'ansible_module_compression' from source: unknown 11762 1726853279.64076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853279.64168: variable 'ansible_facts' from source: unknown 11762 1726853279.64173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/AnsiballZ_command.py 11762 1726853279.64826: Sending initial data 11762 1726853279.64830: Sent initial data (155 bytes) 11762 1726853279.65776: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.65779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.65782: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.65784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853279.65786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.66072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.66137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.67840: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853279.67910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853279.67984: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp5t8kq36n /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/AnsiballZ_command.py <<< 11762 1726853279.67988: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/AnsiballZ_command.py" <<< 11762 1726853279.68036: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp5t8kq36n" to remote "/root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/AnsiballZ_command.py" <<< 11762 1726853279.69460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.69588: stderr chunk (state=3): >>><<< 11762 1726853279.69591: stdout chunk (state=3): >>><<< 11762 1726853279.69594: done transferring module to remote 11762 1726853279.69596: _low_level_execute_command(): starting 11762 1726853279.69598: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/ /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/AnsiballZ_command.py && sleep 0' 11762 1726853279.70792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.71091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853279.71103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.71156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853279.71227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.71486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.71887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.73682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.73712: stderr chunk (state=3): >>><<< 11762 1726853279.73722: stdout chunk (state=3): >>><<< 11762 1726853279.73747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.73757: _low_level_execute_command(): starting 11762 1726853279.73767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/AnsiballZ_command.py && sleep 0' 11762 1726853279.74369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.74388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.74405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.74423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.74442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.74463: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.74487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.74504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853279.74513: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853279.74587: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.74616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853279.74631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.74652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.74753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.90551: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 13:27:59.901027", "end": "2024-09-20 13:27:59.904352", "delta": "0:00:00.003325", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853279.92318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853279.92336: stdout chunk (state=3): >>><<< 11762 1726853279.92353: stderr chunk (state=3): >>><<< 11762 1726853279.92379: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 13:27:59.901027", "end": "2024-09-20 13:27:59.904352", "delta": "0:00:00.003325", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
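The exchange above shows Ansible's command module reading a bonding option straight from sysfs (`cat /sys/class/net/nm-bond/bonding/updelay`) and returning the value in the module's JSON result. Outside of Ansible, the same check can be sketched as a small helper; the function name and the `sysfs_root` parameter are hypothetical conveniences, not anything from the log:

```python
from pathlib import Path


def read_bond_opt(device: str, option: str,
                  sysfs_root: str = "/sys/class/net") -> str:
    """Read one bonding option from sysfs, e.g.
    /sys/class/net/nm-bond/bonding/updelay, and strip the trailing newline
    (the kernel appends one, which is why the log's stdout is just "0")."""
    return Path(sysfs_root, device, "bonding", option).read_text().strip()
```

The `sysfs_root` argument exists only so the helper can be exercised against a fake directory tree in tests; on a real host the default path is the one the log queries.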
11762 1726853279.92414: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/updelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853279.92425: _low_level_execute_command(): starting 11762 1726853279.92446: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853279.6045616-12895-33754641666281/ > /dev/null 2>&1 && sleep 0' 11762 1726853279.93105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.93190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.93240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.93266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.93378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.95352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.95355: stdout chunk (state=3): >>><<< 11762 1726853279.95362: stderr chunk (state=3): >>><<< 11762 1726853279.95380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.95386: handler run complete 11762 1726853279.95407: 
Evaluated conditional (False): False 11762 1726853279.95560: variable 'bond_opt' from source: unknown 11762 1726853279.95678: variable 'result' from source: unknown 11762 1726853279.95685: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853279.95687: attempt loop complete, returning result 11762 1726853279.95689: variable 'bond_opt' from source: unknown 11762 1726853279.95692: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'updelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "updelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/updelay" ], "delta": "0:00:00.003325", "end": "2024-09-20 13:27:59.904352", "rc": 0, "start": "2024-09-20 13:27:59.901027" } STDOUT: 0 11762 1726853279.95994: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.95997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.96000: variable 'omit' from source: magic vars 11762 1726853279.96002: variable 'ansible_distribution_major_version' from source: facts 11762 1726853279.96004: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853279.96006: variable 'omit' from source: magic vars 11762 1726853279.96008: variable 'omit' from source: magic vars 11762 1726853279.96110: variable 'controller_device' from source: play vars 11762 1726853279.96114: variable 'bond_opt' from source: unknown 11762 1726853279.96131: variable 'omit' from source: magic vars 11762 1726853279.96151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853279.96159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853279.96165: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853279.96180: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853279.96183: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.96185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.96411: Set connection var ansible_timeout to 10 11762 1726853279.96414: Set connection var ansible_shell_type to sh 11762 1726853279.96417: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853279.96419: Set connection var ansible_shell_executable to /bin/sh 11762 1726853279.96421: Set connection var ansible_pipelining to False 11762 1726853279.96423: Set connection var ansible_connection to ssh 11762 1726853279.96425: variable 'ansible_shell_executable' from source: unknown 11762 1726853279.96427: variable 'ansible_connection' from source: unknown 11762 1726853279.96429: variable 'ansible_module_compression' from source: unknown 11762 1726853279.96431: variable 'ansible_shell_type' from source: unknown 11762 1726853279.96433: variable 'ansible_shell_executable' from source: unknown 11762 1726853279.96435: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853279.96437: variable 'ansible_pipelining' from source: unknown 11762 1726853279.96438: variable 'ansible_timeout' from source: unknown 11762 1726853279.96440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853279.96447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853279.96449: variable 'omit' from source: magic vars 11762 1726853279.96451: starting 
attempt loop 11762 1726853279.96453: running the handler 11762 1726853279.96455: _low_level_execute_command(): starting 11762 1726853279.96457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853279.97076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.97079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.97082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.97086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.97092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.97100: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.97111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.97125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853279.97134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853279.97139: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853279.97147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.97157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.97168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.97178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.97191: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853279.97201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11762 1726853279.97266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853279.97322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853279.97329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853279.97423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853279.99278: stdout chunk (state=3): >>>/root <<< 11762 1726853279.99282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853279.99285: stdout chunk (state=3): >>><<< 11762 1726853279.99287: stderr chunk (state=3): >>><<< 11762 1726853279.99289: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853279.99295: 
_low_level_execute_command(): starting 11762 1726853279.99323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198 `" && echo ansible-tmp-1726853279.9928586-12895-39965653604198="` echo /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198 `" ) && sleep 0' 11762 1726853279.99893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853279.99942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853279.99945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853279.99948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853279.99950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853279.99953: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853279.99964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853279.99977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853279.99980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853280.00060: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853280.00086: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.00119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.00207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.02236: stdout chunk (state=3): >>>ansible-tmp-1726853279.9928586-12895-39965653604198=/root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198 <<< 11762 1726853280.02679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.02683: stdout chunk (state=3): >>><<< 11762 1726853280.02685: stderr chunk (state=3): >>><<< 11762 1726853280.02688: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853279.9928586-12895-39965653604198=/root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 11762 1726853280.02690: variable 'ansible_module_compression' from source: unknown 11762 1726853280.02692: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853280.02694: variable 'ansible_facts' from source: unknown 11762 1726853280.02760: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/AnsiballZ_command.py 11762 1726853280.02892: Sending initial data 11762 1726853280.02924: Sent initial data (155 bytes) 11762 1726853280.03578: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853280.03689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853280.03714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.03732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.03836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.05514: stderr 
chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853280.05539: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853280.05613: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853280.05696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp11sebb7s /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/AnsiballZ_command.py <<< 11762 1726853280.05699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/AnsiballZ_command.py" <<< 11762 1726853280.05760: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp11sebb7s" to remote "/root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/AnsiballZ_command.py" <<< 11762 1726853280.06775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.06779: stderr chunk (state=3): >>><<< 11762 1726853280.06781: stdout chunk (state=3): >>><<< 11762 1726853280.06784: done 
transferring module to remote 11762 1726853280.06786: _low_level_execute_command(): starting 11762 1726853280.06788: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/ /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/AnsiballZ_command.py && sleep 0' 11762 1726853280.07486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.07539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853280.07555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.07575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.07672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.09758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.09762: stdout chunk (state=3): >>><<< 11762 1726853280.09764: stderr chunk (state=3): >>><<< 11762 1726853280.09781: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853280.09939: _low_level_execute_command(): starting 11762 1726853280.09945: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/AnsiballZ_command.py && sleep 0' 11762 1726853280.10841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853280.10847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853280.10850: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.10852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853280.10854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.10897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853280.10932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.11011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.27205: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 13:28:00.267197", "end": "2024-09-20 13:28:00.270370", "delta": "0:00:00.003173", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853280.29326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853280.29331: stdout chunk (state=3): >>><<< 11762 1726853280.29333: stderr chunk (state=3): >>><<< 11762 1726853280.29336: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 13:28:00.267197", "end": "2024-09-20 13:28:00.270370", "delta": "0:00:00.003173", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
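Each loop iteration above ends with `Evaluated conditional (bond_opt.value in result.stdout): True`, i.e. the test asserts the expected option value appears as a *substring* of the command's stdout rather than comparing for equality. A minimal sketch of that verification pattern (hypothetical function name, not part of the playbook):

```python
def verify_bond_opts(expected: dict, actual_stdout: dict) -> list:
    """Return the names of bond options whose expected value is NOT found
    in the captured stdout for that option.

    Mirrors the log's `bond_opt.value in result.stdout` conditional: a
    substring test, so e.g. expected "0" would also match a stdout of
    "100" -- equality would be the stricter check."""
    return [key for key, value in expected.items()
            if value not in actual_stdout.get(key, "")]
```

With the two results shown in this section (`updelay` read back as `0`, `use_carrier` as `1`), the helper would report no mismatches.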
11762 1726853280.29338: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/use_carrier', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853280.29340: _low_level_execute_command(): starting 11762 1726853280.29342: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853279.9928586-12895-39965653604198/ > /dev/null 2>&1 && sleep 0' 11762 1726853280.30987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.30997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.31000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.31007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.33107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.33166: stderr chunk (state=3): >>><<< 11762 1726853280.33243: stdout chunk (state=3): >>><<< 11762 1726853280.33258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853280.33263: handler run complete 11762 1726853280.33288: Evaluated conditional (False): False 11762 
1726853280.33585: variable 'bond_opt' from source: unknown 11762 1726853280.34081: variable 'result' from source: unknown 11762 1726853280.34095: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853280.34107: attempt loop complete, returning result 11762 1726853280.34127: variable 'bond_opt' from source: unknown 11762 1726853280.34204: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'use_carrier', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "use_carrier", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/use_carrier" ], "delta": "0:00:00.003173", "end": "2024-09-20 13:28:00.270370", "rc": 0, "start": "2024-09-20 13:28:00.267197" } STDOUT: 1 11762 1726853280.34777: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853280.34781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853280.34783: variable 'omit' from source: magic vars 11762 1726853280.35277: variable 'ansible_distribution_major_version' from source: facts 11762 1726853280.35281: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853280.35283: variable 'omit' from source: magic vars 11762 1726853280.35285: variable 'omit' from source: magic vars 11762 1726853280.35543: variable 'controller_device' from source: play vars 11762 1726853280.35551: variable 'bond_opt' from source: unknown 11762 1726853280.35570: variable 'omit' from source: magic vars 11762 1726853280.35743: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853280.35747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853280.35750: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853280.35752: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853280.35754: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853280.35756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853280.35848: Set connection var ansible_timeout to 10 11762 1726853280.35851: Set connection var ansible_shell_type to sh 11762 1726853280.35854: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853280.35856: Set connection var ansible_shell_executable to /bin/sh 11762 1726853280.35858: Set connection var ansible_pipelining to False 11762 1726853280.35860: Set connection var ansible_connection to ssh 11762 1726853280.36186: variable 'ansible_shell_executable' from source: unknown 11762 1726853280.36190: variable 'ansible_connection' from source: unknown 11762 1726853280.36192: variable 'ansible_module_compression' from source: unknown 11762 1726853280.36194: variable 'ansible_shell_type' from source: unknown 11762 1726853280.36195: variable 'ansible_shell_executable' from source: unknown 11762 1726853280.36197: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853280.36199: variable 'ansible_pipelining' from source: unknown 11762 1726853280.36201: variable 'ansible_timeout' from source: unknown 11762 1726853280.36203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853280.36436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853280.36447: variable 'omit' from source: magic vars 11762 1726853280.36450: starting 
attempt loop 11762 1726853280.36452: running the handler 11762 1726853280.36456: _low_level_execute_command(): starting 11762 1726853280.36461: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853280.37941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.38031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.38329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.38537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.40165: stdout chunk (state=3): >>>/root <<< 11762 1726853280.40331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.40389: stderr chunk (state=3): >>><<< 11762 1726853280.40393: stdout chunk (state=3): >>><<< 11762 1726853280.40417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853280.40425: _low_level_execute_command(): starting 11762 1726853280.40428: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606 `" && echo ansible-tmp-1726853280.4041007-12895-235038678089606="` echo /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606 `" ) && sleep 0' 11762 1726853280.41907: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853280.42105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.42190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.42335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.44481: stdout chunk (state=3): >>>ansible-tmp-1726853280.4041007-12895-235038678089606=/root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606 <<< 11762 1726853280.44485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.44588: stdout chunk (state=3): >>><<< 11762 1726853280.44592: stderr chunk (state=3): >>><<< 11762 1726853280.44594: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853280.4041007-12895-235038678089606=/root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853280.44597: variable 'ansible_module_compression' from source: unknown 11762 1726853280.44599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853280.44600: variable 'ansible_facts' from source: unknown 11762 1726853280.44664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/AnsiballZ_command.py 11762 1726853280.44982: Sending initial data 11762 1726853280.44986: Sent initial data (156 bytes) 11762 1726853280.45979: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853280.45983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853280.45985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853280.45987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853280.45989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.46035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853280.46057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.46070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.46304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.47830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853280.47958: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853280.48031: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpi8dq2hn5 /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/AnsiballZ_command.py <<< 11762 1726853280.48034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/AnsiballZ_command.py" <<< 11762 1726853280.48282: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpi8dq2hn5" to remote "/root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/AnsiballZ_command.py" <<< 11762 1726853280.49959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.50086: stderr chunk (state=3): >>><<< 11762 1726853280.50090: stdout chunk (state=3): >>><<< 11762 1726853280.50113: done transferring module to remote 11762 1726853280.50121: _low_level_execute_command(): starting 11762 1726853280.50127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/ /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/AnsiballZ_command.py && sleep 0' 11762 1726853280.51686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853280.51891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853280.52010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853280.52014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853280.52016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 
11762 1726853280.52018: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853280.52020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.52085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.52310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.52382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.54283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.54338: stderr chunk (state=3): >>><<< 11762 1726853280.54593: stdout chunk (state=3): >>><<< 11762 1726853280.54614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853280.54621: _low_level_execute_command(): starting 11762 1726853280.54623: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/AnsiballZ_command.py && sleep 0' 11762 1726853280.55779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853280.55929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853280.56200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853280.56204: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.56207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.72332: stdout chunk (state=3): >>> {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 13:28:00.719141", "end": "2024-09-20 13:28:00.722248", "delta": "0:00:00.003107", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853280.73963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853280.73967: stdout chunk (state=3): >>><<< 11762 1726853280.73974: stderr chunk (state=3): >>><<< 11762 1726853280.74007: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 13:28:00.719141", "end": "2024-09-20 13:28:00.722248", "delta": "0:00:00.003107", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853280.74257: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/xmit_hash_policy', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853280.74265: _low_level_execute_command(): starting 11762 1726853280.74268: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853280.4041007-12895-235038678089606/ > /dev/null 
2>&1 && sleep 0' 11762 1726853280.75781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853280.75980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853280.75997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853280.76009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853280.76109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853280.78019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853280.78077: stderr chunk (state=3): >>><<< 11762 1726853280.78207: stdout chunk (state=3): >>><<< 11762 1726853280.78227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853280.78231: handler run complete 11762 1726853280.78254: Evaluated conditional (False): False 11762 1726853280.78613: variable 'bond_opt' from source: unknown 11762 1726853280.78619: variable 'result' from source: unknown 11762 1726853280.78633: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853280.78646: attempt loop complete, returning result 11762 1726853280.78663: variable 'bond_opt' from source: unknown 11762 1726853280.78845: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'xmit_hash_policy', 'value': 'encap2+3'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "xmit_hash_policy", "value": "encap2+3" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy" ], "delta": "0:00:00.003107", "end": "2024-09-20 13:28:00.722248", "rc": 0, "start": "2024-09-20 13:28:00.719141" } STDOUT: encap2+3 3 11762 1726853280.79022: dumping result to json 11762 1726853280.79025: done dumping result, returning 11762 1726853280.79027: done running TaskExecutor() for managed_node2/TASK: ** TEST check 
bond settings [02083763-bbaf-d845-03d0-000000000400] 11762 1726853280.79029: sending task result for task 02083763-bbaf-d845-03d0-000000000400 11762 1726853280.81079: done sending task result for task 02083763-bbaf-d845-03d0-000000000400 11762 1726853280.81085: WORKER PROCESS EXITING 11762 1726853280.81127: no more pending results, returning what we have 11762 1726853280.81131: results queue empty 11762 1726853280.81132: checking for any_errors_fatal 11762 1726853280.81137: done checking for any_errors_fatal 11762 1726853280.81138: checking for max_fail_percentage 11762 1726853280.81140: done checking for max_fail_percentage 11762 1726853280.81141: checking to see if all hosts have failed and the running result is not ok 11762 1726853280.81141: done checking to see if all hosts have failed 11762 1726853280.81145: getting the remaining hosts for this loop 11762 1726853280.81146: done getting the remaining hosts for this loop 11762 1726853280.81150: getting the next task for host managed_node2 11762 1726853280.81157: done getting next task for host managed_node2 11762 1726853280.81160: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 11762 1726853280.81164: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11762 1726853280.81168: getting variables 11762 1726853280.81169: in VariableManager get_vars() 11762 1726853280.81197: Calling all_inventory to load vars for managed_node2 11762 1726853280.81200: Calling groups_inventory to load vars for managed_node2 11762 1726853280.81203: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853280.81213: Calling all_plugins_play to load vars for managed_node2 11762 1726853280.81216: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853280.81219: Calling groups_plugins_play to load vars for managed_node2 11762 1726853280.85944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853280.90700: done with get_vars() 11762 1726853280.90735: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Friday 20 September 2024 13:28:00 -0400 (0:00:06.312) 0:00:31.342 ****** 11762 1726853280.91245: entering _queue_task() for managed_node2/include_tasks 11762 1726853280.92408: worker is 1 (out of 1 available) 11762 1726853280.92418: exiting _queue_task() for managed_node2/include_tasks 11762 1726853280.92430: done queuing things up, now waiting for results queue to drain 11762 1726853280.92432: waiting for pending results... 
11762 1726853280.92545: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' 11762 1726853280.93016: in run() - task 02083763-bbaf-d845-03d0-000000000402 11762 1726853280.93022: variable 'ansible_search_path' from source: unknown 11762 1726853280.93025: variable 'ansible_search_path' from source: unknown 11762 1726853280.93028: calling self._execute() 11762 1726853280.93236: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853280.93249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853280.93263: variable 'omit' from source: magic vars 11762 1726853280.94780: variable 'ansible_distribution_major_version' from source: facts 11762 1726853280.94785: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853280.94788: _execute() done 11762 1726853280.94791: dumping result to json 11762 1726853280.94793: done dumping result, returning 11762 1726853280.94795: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' [02083763-bbaf-d845-03d0-000000000402] 11762 1726853280.94797: sending task result for task 02083763-bbaf-d845-03d0-000000000402 11762 1726853280.94909: no more pending results, returning what we have 11762 1726853280.94915: in VariableManager get_vars() 11762 1726853280.94963: Calling all_inventory to load vars for managed_node2 11762 1726853280.94967: Calling groups_inventory to load vars for managed_node2 11762 1726853280.94972: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853280.94986: Calling all_plugins_play to load vars for managed_node2 11762 1726853280.94989: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853280.94991: Calling groups_plugins_play to load vars for managed_node2 11762 1726853280.95881: done sending task result for task 02083763-bbaf-d845-03d0-000000000402 11762 1726853280.95887: WORKER PROCESS EXITING 11762 
1726853280.99546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853281.04263: done with get_vars() 11762 1726853281.04294: variable 'ansible_search_path' from source: unknown 11762 1726853281.04295: variable 'ansible_search_path' from source: unknown 11762 1726853281.04305: variable 'item' from source: include params 11762 1726853281.04816: variable 'item' from source: include params 11762 1726853281.04856: we have included files to process 11762 1726853281.04858: generating all_blocks data 11762 1726853281.04860: done generating all_blocks data 11762 1726853281.04865: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11762 1726853281.04867: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11762 1726853281.04869: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11762 1726853281.05536: done processing included file 11762 1726853281.05539: iterating over new_blocks loaded from include file 11762 1726853281.05540: in VariableManager get_vars() 11762 1726853281.05560: done with get_vars() 11762 1726853281.05563: filtering new block on tags 11762 1726853281.05999: done filtering new block on tags 11762 1726853281.06002: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node2 11762 1726853281.06009: extending task lists for all hosts with included blocks 11762 1726853281.06620: done extending task lists 11762 1726853281.06622: done processing included files 11762 1726853281.06622: results queue empty 11762 1726853281.06623: checking for any_errors_fatal 11762 1726853281.06637: 
done checking for any_errors_fatal 11762 1726853281.06638: checking for max_fail_percentage 11762 1726853281.06639: done checking for max_fail_percentage 11762 1726853281.06640: checking to see if all hosts have failed and the running result is not ok 11762 1726853281.06640: done checking to see if all hosts have failed 11762 1726853281.06641: getting the remaining hosts for this loop 11762 1726853281.06645: done getting the remaining hosts for this loop 11762 1726853281.06647: getting the next task for host managed_node2 11762 1726853281.06652: done getting next task for host managed_node2 11762 1726853281.06654: ^ task is: TASK: ** TEST check IPv4 11762 1726853281.06658: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853281.06660: getting variables 11762 1726853281.06661: in VariableManager get_vars() 11762 1726853281.06674: Calling all_inventory to load vars for managed_node2 11762 1726853281.06676: Calling groups_inventory to load vars for managed_node2 11762 1726853281.06679: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853281.06685: Calling all_plugins_play to load vars for managed_node2 11762 1726853281.06687: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853281.06690: Calling groups_plugins_play to load vars for managed_node2 11762 1726853281.10794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853281.15629: done with get_vars() 11762 1726853281.15664: done getting variables 11762 1726853281.15712: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 13:28:01 -0400 (0:00:00.244) 0:00:31.587 ****** 11762 1726853281.15745: entering _queue_task() for managed_node2/command 11762 1726853281.16929: worker is 1 (out of 1 available) 11762 1726853281.16941: exiting _queue_task() for managed_node2/command 11762 1726853281.16958: done queuing things up, now waiting for results queue to drain 11762 1726853281.16960: waiting for pending results... 
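Before each task runs, the log records "Evaluated conditional (ansible_distribution_major_version != '6'): True" — the test playbook skips its checks on EL6-era hosts. A minimal sketch of that gate, mirroring the comparison against the gathered fact (Ansible itself evaluates the expression through Jinja2; this is just the boolean logic):

```python
# Minimal sketch of the per-task conditional seen in the log:
#   Evaluated conditional (ansible_distribution_major_version != '6')
# The fact value is a string, so the comparison is a string comparison.
def should_run(facts: dict) -> bool:
    """Return True when the host is not a major-version-6 distribution."""
    return facts.get("ansible_distribution_major_version") != "6"

print(should_run({"ansible_distribution_major_version": "9"}))  # True
print(should_run({"ansible_distribution_major_version": "6"}))  # False
```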
11762 1726853281.17843: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 11762 1726853281.18313: in run() - task 02083763-bbaf-d845-03d0-000000000631 11762 1726853281.18338: variable 'ansible_search_path' from source: unknown 11762 1726853281.18346: variable 'ansible_search_path' from source: unknown 11762 1726853281.18389: calling self._execute() 11762 1726853281.18487: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853281.18587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853281.18704: variable 'omit' from source: magic vars 11762 1726853281.19467: variable 'ansible_distribution_major_version' from source: facts 11762 1726853281.19490: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853281.19503: variable 'omit' from source: magic vars 11762 1726853281.19565: variable 'omit' from source: magic vars 11762 1726853281.19899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853281.22320: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853281.22416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853281.22444: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853281.22485: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853281.22632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853281.22635: variable 'interface' from source: include params 11762 1726853281.22637: variable 'controller_device' from source: play vars 11762 1726853281.22702: variable 'controller_device' from source: play vars 11762 1726853281.22731: variable 'omit' 
from source: magic vars 11762 1726853281.22776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853281.22808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853281.22831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853281.22861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853281.22879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853281.22913: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853281.22922: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853281.22930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853281.23036: Set connection var ansible_timeout to 10 11762 1726853281.23045: Set connection var ansible_shell_type to sh 11762 1726853281.23056: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853281.23077: Set connection var ansible_shell_executable to /bin/sh 11762 1726853281.23092: Set connection var ansible_pipelining to False 11762 1726853281.23102: Set connection var ansible_connection to ssh 11762 1726853281.23129: variable 'ansible_shell_executable' from source: unknown 11762 1726853281.23138: variable 'ansible_connection' from source: unknown 11762 1726853281.23145: variable 'ansible_module_compression' from source: unknown 11762 1726853281.23152: variable 'ansible_shell_type' from source: unknown 11762 1726853281.23177: variable 'ansible_shell_executable' from source: unknown 11762 1726853281.23180: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853281.23182: variable 'ansible_pipelining' from source: unknown 
11762 1726853281.23187: variable 'ansible_timeout' from source: unknown 11762 1726853281.23276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853281.23393: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853281.23396: variable 'omit' from source: magic vars 11762 1726853281.23402: starting attempt loop 11762 1726853281.23406: running the handler 11762 1726853281.23408: _low_level_execute_command(): starting 11762 1726853281.23410: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853281.24174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853281.24228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853281.24246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853281.24787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853281.26491: stdout chunk (state=3): >>>/root <<< 11762 1726853281.26723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853281.26819: stderr chunk (state=3): >>><<< 11762 1726853281.26823: stdout chunk (state=3): >>><<< 11762 1726853281.26984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853281.26988: _low_level_execute_command(): starting 11762 1726853281.26991: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166 `" && echo 
ansible-tmp-1726853281.2689342-13306-76257775579166="` echo /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166 `" ) && sleep 0' 11762 1726853281.28375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853281.28457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853281.28477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853281.28588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853281.28640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853281.28723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853281.28726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853281.29061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853281.30988: stdout chunk (state=3): >>>ansible-tmp-1726853281.2689342-13306-76257775579166=/root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166 <<< 11762 1726853281.31010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 
1726853281.31057: stderr chunk (state=3): >>><<< 11762 1726853281.31061: stdout chunk (state=3): >>><<< 11762 1726853281.31083: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853281.2689342-13306-76257775579166=/root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853281.31114: variable 'ansible_module_compression' from source: unknown 11762 1726853281.31202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853281.31205: variable 'ansible_facts' from source: unknown 11762 1726853281.31481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/AnsiballZ_command.py 11762 1726853281.31964: Sending initial data 11762 1726853281.31967: 
Sent initial data (155 bytes) 11762 1726853281.33094: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853281.33114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853281.33128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853281.33462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853281.35041: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853281.35099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853281.35185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpnyltm_4_ /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/AnsiballZ_command.py <<< 11762 1726853281.35189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/AnsiballZ_command.py" <<< 11762 1726853281.35275: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpnyltm_4_" to remote "/root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/AnsiballZ_command.py" <<< 11762 1726853281.36776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853281.36787: stdout chunk (state=3): >>><<< 11762 1726853281.36792: stderr chunk (state=3): >>><<< 11762 1726853281.36853: done transferring module to remote 11762 1726853281.36864: _low_level_execute_command(): starting 11762 1726853281.36869: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/ /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/AnsiballZ_command.py && sleep 0' 11762 1726853281.38120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853281.38129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853281.38143: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11762 1726853281.38156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853281.38167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853281.38177: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853281.38185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853281.38198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853281.38206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853281.38212: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853281.38220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853281.38229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853281.38241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853281.38250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853281.38258: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853281.38267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853281.38483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853281.38554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853281.40740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853281.40745: stdout chunk (state=3): >>><<< 11762 
1726853281.40752: stderr chunk (state=3): >>><<< 11762 1726853281.40769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853281.40774: _low_level_execute_command(): starting 11762 1726853281.40780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/AnsiballZ_command.py && sleep 0' 11762 1726853281.42382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853281.42386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853281.42388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853281.42391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853281.42597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853281.59610: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.73/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:01.589637", "end": "2024-09-20 13:28:01.593534", "delta": "0:00:00.003897", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853281.61252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853281.61256: stdout chunk (state=3): >>><<< 11762 1726853281.61264: stderr chunk (state=3): >>><<< 11762 1726853281.61285: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.73/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:01.589637", "end": "2024-09-20 13:28:01.593534", "delta": "0:00:00.003897", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853281.61324: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853281.61333: _low_level_execute_command(): starting 11762 1726853281.61336: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853281.2689342-13306-76257775579166/ > /dev/null 2>&1 && sleep 0' 11762 1726853281.62613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853281.62684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853281.62976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853281.62995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853281.64864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853281.64910: stderr chunk (state=3): >>><<< 11762 1726853281.64913: stdout chunk (state=3): >>><<< 11762 1726853281.64935: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853281.64943: handler run complete 11762 1726853281.64969: 
Evaluated conditional (False): False 11762 1726853281.65356: variable 'address' from source: include params 11762 1726853281.65359: variable 'result' from source: set_fact 11762 1726853281.65379: Evaluated conditional (address in result.stdout): True 11762 1726853281.65391: attempt loop complete, returning result 11762 1726853281.65394: _execute() done 11762 1726853281.65396: dumping result to json 11762 1726853281.65401: done dumping result, returning 11762 1726853281.65409: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 [02083763-bbaf-d845-03d0-000000000631] 11762 1726853281.65414: sending task result for task 02083763-bbaf-d845-03d0-000000000631
ok: [managed_node2] => {
    "attempts": 1,
    "changed": false,
    "cmd": [
        "ip",
        "-4",
        "a",
        "s",
        "nm-bond"
    ],
    "delta": "0:00:00.003897",
    "end": "2024-09-20 13:28:01.593534",
    "rc": 0,
    "start": "2024-09-20 13:28:01.589637"
}

STDOUT:

13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000
    inet 192.0.2.73/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond
       valid_lft 228sec preferred_lft 228sec

11762 1726853281.65664: no more pending results, returning what we have 11762 1726853281.65669: results queue empty 11762 1726853281.65670: checking for any_errors_fatal 11762 1726853281.65673: done checking for any_errors_fatal 11762 1726853281.65673: checking for max_fail_percentage 11762 1726853281.65676: done checking for max_fail_percentage 11762 1726853281.65677: checking to see if all hosts have failed and the running result is not ok 11762 1726853281.65677: done checking to see if all hosts have failed 11762 1726853281.65678: getting the remaining hosts for this loop 11762 1726853281.65680: done getting the remaining hosts for this loop 11762 1726853281.65683: getting the next task for host managed_node2 11762 1726853281.65691: done getting next task for host managed_node2 11762 1726853281.65693: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 11762 1726853281.65697: ^ state
is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853281.65701: getting variables 11762 1726853281.65702: in VariableManager get_vars() 11762 1726853281.65734: Calling all_inventory to load vars for managed_node2 11762 1726853281.65737: Calling groups_inventory to load vars for managed_node2 11762 1726853281.65740: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853281.65750: Calling all_plugins_play to load vars for managed_node2 11762 1726853281.65753: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853281.65755: Calling groups_plugins_play to load vars for managed_node2 11762 1726853281.66432: done sending task result for task 02083763-bbaf-d845-03d0-000000000631 11762 1726853281.66435: WORKER PROCESS EXITING 11762 1726853281.68853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853281.72081: done with get_vars() 11762 1726853281.72115: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 13:28:01 -0400 (0:00:00.565) 0:00:32.153 ****** 11762 1726853281.72330: entering _queue_task() for managed_node2/include_tasks 11762 1726853281.73161: worker is 1 (out of 1 available) 11762 1726853281.73177: exiting _queue_task() for managed_node2/include_tasks 11762 1726853281.73190: done queuing things up, now waiting for results queue to drain 11762 1726853281.73192: waiting for pending results... 11762 1726853281.73891: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' 11762 1726853281.73991: in run() - task 02083763-bbaf-d845-03d0-000000000403 11762 1726853281.74011: variable 'ansible_search_path' from source: unknown 11762 1726853281.74017: variable 'ansible_search_path' from source: unknown 11762 1726853281.74055: calling self._execute() 11762 1726853281.74276: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853281.74280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853281.74283: variable 'omit' from source: magic vars 11762 1726853281.75037: variable 'ansible_distribution_major_version' from source: facts 11762 1726853281.75090: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853281.75186: _execute() done 11762 1726853281.75195: dumping result to json 11762 1726853281.75203: done dumping result, returning 11762 1726853281.75213: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' [02083763-bbaf-d845-03d0-000000000403] 11762 1726853281.75223: sending task result for task 02083763-bbaf-d845-03d0-000000000403 11762 1726853281.75361: no more pending results, returning what we have 11762 1726853281.75367: in VariableManager get_vars() 11762 1726853281.75408: Calling all_inventory to load vars for managed_node2 11762 
1726853281.75411: Calling groups_inventory to load vars for managed_node2 11762 1726853281.75414: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853281.75428: Calling all_plugins_play to load vars for managed_node2 11762 1726853281.75432: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853281.75434: Calling groups_plugins_play to load vars for managed_node2 11762 1726853281.76579: done sending task result for task 02083763-bbaf-d845-03d0-000000000403 11762 1726853281.76583: WORKER PROCESS EXITING 11762 1726853281.78501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853281.81679: done with get_vars() 11762 1726853281.81705: variable 'ansible_search_path' from source: unknown 11762 1726853281.81707: variable 'ansible_search_path' from source: unknown 11762 1726853281.81717: variable 'item' from source: include params 11762 1726853281.81934: variable 'item' from source: include params 11762 1726853281.81968: we have included files to process 11762 1726853281.81969: generating all_blocks data 11762 1726853281.82084: done generating all_blocks data 11762 1726853281.82091: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11762 1726853281.82093: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11762 1726853281.82096: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11762 1726853281.82565: done processing included file 11762 1726853281.82566: iterating over new_blocks loaded from include file 11762 1726853281.82568: in VariableManager get_vars() 11762 1726853281.82587: done with get_vars() 11762 1726853281.82588: filtering new block on tags 11762 1726853281.82614: done 
filtering new block on tags 11762 1726853281.82617: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node2 11762 1726853281.82675: extending task lists for all hosts with included blocks 11762 1726853281.83485: done extending task lists 11762 1726853281.83487: done processing included files 11762 1726853281.83488: results queue empty 11762 1726853281.83490: checking for any_errors_fatal 11762 1726853281.83495: done checking for any_errors_fatal 11762 1726853281.83496: checking for max_fail_percentage 11762 1726853281.83497: done checking for max_fail_percentage 11762 1726853281.83498: checking to see if all hosts have failed and the running result is not ok 11762 1726853281.83499: done checking to see if all hosts have failed 11762 1726853281.83499: getting the remaining hosts for this loop 11762 1726853281.83501: done getting the remaining hosts for this loop 11762 1726853281.83503: getting the next task for host managed_node2 11762 1726853281.83507: done getting next task for host managed_node2 11762 1726853281.83509: ^ task is: TASK: ** TEST check IPv6 11762 1726853281.83513: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853281.83515: getting variables 11762 1726853281.83516: in VariableManager get_vars() 11762 1726853281.83526: Calling all_inventory to load vars for managed_node2 11762 1726853281.83528: Calling groups_inventory to load vars for managed_node2 11762 1726853281.83530: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853281.83536: Calling all_plugins_play to load vars for managed_node2 11762 1726853281.83539: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853281.83542: Calling groups_plugins_play to load vars for managed_node2 11762 1726853281.86431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853281.89550: done with get_vars() 11762 1726853281.89582: done getting variables 11762 1726853281.89631: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [** TEST check IPv6] ******************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3
Friday 20 September 2024 13:28:01 -0400 (0:00:00.173) 0:00:32.327 ******

11762 1726853281.89664: entering _queue_task() for managed_node2/command 11762 1726853281.90417: worker is 1 (out of 1 available) 11762 1726853281.90431:
exiting _queue_task() for managed_node2/command 11762 1726853281.90442: done queuing things up, now waiting for results queue to drain 11762 1726853281.90444: waiting for pending results... 11762 1726853281.90927: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 11762 1726853281.91212: in run() - task 02083763-bbaf-d845-03d0-000000000652 11762 1726853281.91230: variable 'ansible_search_path' from source: unknown 11762 1726853281.91420: variable 'ansible_search_path' from source: unknown 11762 1726853281.91424: calling self._execute() 11762 1726853281.91552: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853281.91564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853281.91580: variable 'omit' from source: magic vars 11762 1726853281.92292: variable 'ansible_distribution_major_version' from source: facts 11762 1726853281.92391: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853281.92403: variable 'omit' from source: magic vars 11762 1726853281.92465: variable 'omit' from source: magic vars 11762 1726853281.92795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853281.98430: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853281.98607: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853281.98863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853281.98867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853281.98870: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853281.99012: variable 'controller_device' from source: play vars 
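Earlier in this run the `** TEST check IPv4` task passed on the conditional `address in result.stdout`, and the `** TEST check IPv6` task being queued here repeats the same pattern against `ip -6 a s nm-bond`. A minimal sketch of that check in plain Python (the JSON literal is copied, trimmed to the fields used, from the module result logged above; the exact `address` value is an assumption inferred from the matching stdout):

```python
import json

# Module result for "** TEST check IPv4", copied from the log above and
# trimmed to the fields used here (the remaining module_args are omitted).
raw = r'''{"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n    inet 192.0.2.73/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n       valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"]}'''

result = json.loads(raw)

# The include-param 'address'; the exact value is an assumption inferred
# from the stdout that made the conditional evaluate to True.
address = "192.0.2.73"

print("Evaluated conditional (address in result.stdout):", address in result["stdout"])
# -> Evaluated conditional (address in result.stdout): True
```

The task retries (here `"attempts": 1` sufficed) until the expected address shows up in the interface's `ip` output, i.e. until DHCP has handed the bond its lease.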
11762 1726853281.99042: variable 'omit' from source: magic vars 11762 1726853281.99294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853281.99298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853281.99300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853281.99303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853281.99305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853281.99307: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853281.99407: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853281.99416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853281.99590: Set connection var ansible_timeout to 10 11762 1726853281.99600: Set connection var ansible_shell_type to sh 11762 1726853281.99612: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853281.99730: Set connection var ansible_shell_executable to /bin/sh 11762 1726853281.99733: Set connection var ansible_pipelining to False 11762 1726853281.99736: Set connection var ansible_connection to ssh 11762 1726853281.99738: variable 'ansible_shell_executable' from source: unknown 11762 1726853281.99740: variable 'ansible_connection' from source: unknown 11762 1726853281.99742: variable 'ansible_module_compression' from source: unknown 11762 1726853281.99947: variable 'ansible_shell_type' from source: unknown 11762 1726853281.99951: variable 'ansible_shell_executable' from source: unknown 11762 1726853281.99953: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853281.99955: variable 
'ansible_pipelining' from source: unknown 11762 1726853281.99957: variable 'ansible_timeout' from source: unknown 11762 1726853281.99959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853282.00165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853282.00168: variable 'omit' from source: magic vars 11762 1726853282.00172: starting attempt loop 11762 1726853282.00174: running the handler 11762 1726853282.00176: _low_level_execute_command(): starting 11762 1726853282.00178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853282.01700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853282.01798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853282.01905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853282.03682: stdout chunk (state=3): >>>/root <<< 11762 1726853282.03835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853282.03838: stdout chunk (state=3): >>><<< 11762 1726853282.03840: stderr chunk (state=3): >>><<< 11762 1726853282.04174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853282.04180: _low_level_execute_command(): starting 11762 1726853282.04184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491 `" && echo 
ansible-tmp-1726853282.0408618-13323-113216529638491="` echo /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491 `" ) && sleep 0' 11762 1726853282.05335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853282.05339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853282.05342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853282.05345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853282.05347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853282.05349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853282.05520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853282.05523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853282.05669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853282.07920: stdout chunk (state=3): >>>ansible-tmp-1726853282.0408618-13323-113216529638491=/root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491 <<< 11762 1726853282.07923: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11762 1726853282.07932: stderr chunk (state=3): >>><<< 11762 1726853282.07941: stdout chunk (state=3): >>><<< 11762 1726853282.08121: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853282.0408618-13323-113216529638491=/root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853282.08124: variable 'ansible_module_compression' from source: unknown 11762 1726853282.08127: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853282.08361: variable 'ansible_facts' from source: unknown 11762 1726853282.08614: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/AnsiballZ_command.py 11762 
1726853282.09203: Sending initial data 11762 1726853282.09207: Sent initial data (156 bytes) 11762 1726853282.10531: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853282.10690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853282.10724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853282.10834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853282.10986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853282.11200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853282.13019: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853282.13041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853282.13211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp2vxdu8pj /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/AnsiballZ_command.py <<< 11762 1726853282.13215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/AnsiballZ_command.py" <<< 11762 1726853282.13281: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp2vxdu8pj" to remote "/root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/AnsiballZ_command.py" <<< 11762 1726853282.15553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853282.15558: stdout chunk (state=3): >>><<< 11762 1726853282.15561: stderr chunk (state=3): >>><<< 11762 1726853282.15562: done transferring module to remote 11762 1726853282.15564: _low_level_execute_command(): starting 11762 1726853282.15566: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/ /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/AnsiballZ_command.py && sleep 0' 11762 1726853282.16651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config <<< 11762 1726853282.16655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853282.16662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853282.16664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853282.16883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853282.16886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853282.16993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853282.17068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853282.19114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853282.19118: stdout chunk (state=3): >>><<< 11762 1726853282.19120: stderr chunk (state=3): >>><<< 11762 1726853282.19123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853282.19131: _low_level_execute_command(): starting 11762 1726853282.19145: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/AnsiballZ_command.py && sleep 0' 11762 1726853282.20617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853282.20759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853282.20780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853282.20885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853282.36906: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::c4/128 scope global dynamic noprefixroute \n valid_lft 227sec preferred_lft 227sec\n inet6 2001:db8::a43b:3ff:fedf:a83/64 scope global dynamic noprefixroute \n valid_lft 1791sec preferred_lft 1791sec\n inet6 fe80::a43b:3ff:fedf:a83/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:02.363536", "end": "2024-09-20 13:28:02.367342", "delta": "0:00:00.003806", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853282.38553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853282.38706: stdout chunk (state=3): >>><<< 11762 1726853282.38710: stderr chunk (state=3): >>><<< 11762 1726853282.38928: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::c4/128 scope global dynamic noprefixroute \n valid_lft 227sec preferred_lft 227sec\n inet6 2001:db8::a43b:3ff:fedf:a83/64 scope global dynamic noprefixroute \n valid_lft 1791sec preferred_lft 1791sec\n inet6 fe80::a43b:3ff:fedf:a83/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:02.363536", "end": "2024-09-20 13:28:02.367342", "delta": "0:00:00.003806", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853282.38934: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853282.38944: _low_level_execute_command(): starting 11762 1726853282.38947: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853282.0408618-13323-113216529638491/ > /dev/null 2>&1 && sleep 0' 11762 1726853282.40916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853282.40920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853282.40923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853282.41238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853282.41578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853282.41683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853282.43765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853282.43769: stdout chunk (state=3): >>><<< 11762 1726853282.43776: stderr chunk (state=3): >>><<< 11762 1726853282.43794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853282.43799: handler run complete 11762 1726853282.43824: Evaluated conditional (False): False 11762 1726853282.44590: variable 'address' from source: include params 11762 1726853282.44594: variable 'result' from source: set_fact 11762 1726853282.44612: Evaluated conditional (address in result.stdout): True 11762 1726853282.44624: attempt loop complete, returning result 11762 1726853282.44627: _execute() done 11762 1726853282.44630: dumping result to json 11762 1726853282.44635: done dumping result, returning 11762 1726853282.44648: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 [02083763-bbaf-d845-03d0-000000000652] 11762 1726853282.44650: sending task result for task 02083763-bbaf-d845-03d0-000000000652 11762 1726853282.44830: done sending task result for task 02083763-bbaf-d845-03d0-000000000652 11762 1726853282.44833: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003806", "end": "2024-09-20 13:28:02.367342", "rc": 0, "start": "2024-09-20 13:28:02.363536" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::c4/128 scope global dynamic noprefixroute valid_lft 227sec preferred_lft 227sec inet6 2001:db8::a43b:3ff:fedf:a83/64 scope global dynamic noprefixroute valid_lft 1791sec preferred_lft 1791sec inet6 fe80::a43b:3ff:fedf:a83/64 scope link noprefixroute valid_lft forever preferred_lft forever 11762 1726853282.44937: no more pending results, returning what we have 11762 1726853282.44952: results queue empty 11762 1726853282.44953: checking for any_errors_fatal 11762 1726853282.44955: done checking for any_errors_fatal 11762 1726853282.44956: checking for 
max_fail_percentage 11762 1726853282.44958: done checking for max_fail_percentage 11762 1726853282.44959: checking to see if all hosts have failed and the running result is not ok 11762 1726853282.44960: done checking to see if all hosts have failed 11762 1726853282.44961: getting the remaining hosts for this loop 11762 1726853282.44963: done getting the remaining hosts for this loop 11762 1726853282.44967: getting the next task for host managed_node2 11762 1726853282.44979: done getting next task for host managed_node2 11762 1726853282.44982: ^ task is: TASK: Conditional asserts 11762 1726853282.44984: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853282.44989: getting variables 11762 1726853282.44990: in VariableManager get_vars() 11762 1726853282.45023: Calling all_inventory to load vars for managed_node2 11762 1726853282.45025: Calling groups_inventory to load vars for managed_node2 11762 1726853282.45028: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853282.45038: Calling all_plugins_play to load vars for managed_node2 11762 1726853282.45041: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853282.45045: Calling groups_plugins_play to load vars for managed_node2 11762 1726853282.48701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853282.52534: done with get_vars() 11762 1726853282.52685: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 13:28:02 -0400 (0:00:00.632) 0:00:32.959 ****** 11762 1726853282.52910: entering _queue_task() for managed_node2/include_tasks 11762 1726853282.53697: worker is 1 (out of 1 available) 11762 1726853282.53710: exiting _queue_task() for managed_node2/include_tasks 11762 1726853282.53948: done queuing things up, now waiting for results queue to drain 11762 1726853282.53950: waiting for pending results... 
11762 1726853282.54441: running TaskExecutor() for managed_node2/TASK: Conditional asserts 11762 1726853282.54978: in run() - task 02083763-bbaf-d845-03d0-00000000008e 11762 1726853282.54982: variable 'ansible_search_path' from source: unknown 11762 1726853282.54985: variable 'ansible_search_path' from source: unknown 11762 1726853282.55659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853282.59915: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853282.60016: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853282.60064: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853282.60115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853282.60148: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853282.60253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853282.60304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853282.60336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853282.60403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 11762 1726853282.60476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853282.60602: dumping result to json 11762 1726853282.60622: done dumping result, returning 11762 1726853282.60634: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-d845-03d0-00000000008e] 11762 1726853282.60648: sending task result for task 02083763-bbaf-d845-03d0-00000000008e 11762 1726853282.60798: done sending task result for task 02083763-bbaf-d845-03d0-00000000008e 11762 1726853282.60801: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 11762 1726853282.60868: no more pending results, returning what we have 11762 1726853282.60874: results queue empty 11762 1726853282.60875: checking for any_errors_fatal 11762 1726853282.60887: done checking for any_errors_fatal 11762 1726853282.60888: checking for max_fail_percentage 11762 1726853282.60890: done checking for max_fail_percentage 11762 1726853282.60891: checking to see if all hosts have failed and the running result is not ok 11762 1726853282.60892: done checking to see if all hosts have failed 11762 1726853282.60893: getting the remaining hosts for this loop 11762 1726853282.60895: done getting the remaining hosts for this loop 11762 1726853282.60899: getting the next task for host managed_node2 11762 1726853282.60907: done getting next task for host managed_node2 11762 1726853282.60910: ^ task is: TASK: Success in test '{{ lsr_description }}' 11762 1726853282.60913: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853282.60918: getting variables 11762 1726853282.60920: in VariableManager get_vars() 11762 1726853282.61078: Calling all_inventory to load vars for managed_node2 11762 1726853282.61081: Calling groups_inventory to load vars for managed_node2 11762 1726853282.61087: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853282.61099: Calling all_plugins_play to load vars for managed_node2 11762 1726853282.61103: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853282.61106: Calling groups_plugins_play to load vars for managed_node2 11762 1726853282.62817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853282.64532: done with get_vars() 11762 1726853282.64575: done getting variables 11762 1726853282.64639: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853282.64781: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 
13:28:02 -0400 (0:00:00.119) 0:00:33.078 ****** 11762 1726853282.64811: entering _queue_task() for managed_node2/debug 11762 1726853282.65192: worker is 1 (out of 1 available) 11762 1726853282.65321: exiting _queue_task() for managed_node2/debug 11762 1726853282.65333: done queuing things up, now waiting for results queue to drain 11762 1726853282.65335: waiting for pending results... 11762 1726853282.65701: running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 11762 1726853282.65706: in run() - task 02083763-bbaf-d845-03d0-00000000008f 11762 1726853282.65710: variable 'ansible_search_path' from source: unknown 11762 1726853282.65712: variable 'ansible_search_path' from source: unknown 11762 1726853282.65755: calling self._execute() 11762 1726853282.65877: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853282.65901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853282.65920: variable 'omit' from source: magic vars 11762 1726853282.66307: variable 'ansible_distribution_major_version' from source: facts 11762 1726853282.66325: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853282.66355: variable 'omit' from source: magic vars 11762 1726853282.66402: variable 'omit' from source: magic vars 11762 1726853282.66522: variable 'lsr_description' from source: include params 11762 1726853282.66670: variable 'omit' from source: magic vars 11762 1726853282.66676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853282.66679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853282.66689: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853282.66711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853282.66727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853282.66776: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853282.66790: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853282.66799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853282.66919: Set connection var ansible_timeout to 10 11762 1726853282.66927: Set connection var ansible_shell_type to sh 11762 1726853282.66938: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853282.66951: Set connection var ansible_shell_executable to /bin/sh 11762 1726853282.66963: Set connection var ansible_pipelining to False 11762 1726853282.66977: Set connection var ansible_connection to ssh 11762 1726853282.67020: variable 'ansible_shell_executable' from source: unknown 11762 1726853282.67029: variable 'ansible_connection' from source: unknown 11762 1726853282.67037: variable 'ansible_module_compression' from source: unknown 11762 1726853282.67046: variable 'ansible_shell_type' from source: unknown 11762 1726853282.67053: variable 'ansible_shell_executable' from source: unknown 11762 1726853282.67103: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853282.67108: variable 'ansible_pipelining' from source: unknown 11762 1726853282.67110: variable 'ansible_timeout' from source: unknown 11762 1726853282.67112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853282.67250: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853282.67268: variable 'omit' from source: magic vars 11762 1726853282.67282: starting attempt loop 11762 1726853282.67318: running the handler 11762 1726853282.67360: handler run complete 11762 1726853282.67382: attempt loop complete, returning result 11762 1726853282.67390: _execute() done 11762 1726853282.67428: dumping result to json 11762 1726853282.67436: done dumping result, returning 11762 1726853282.67439: done running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [02083763-bbaf-d845-03d0-00000000008f] 11762 1726853282.67441: sending task result for task 02083763-bbaf-d845-03d0-00000000008f ok: [managed_node2] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
+++++ 11762 1726853282.67708: no more pending results, returning what we have 11762 1726853282.67712: results queue empty 11762 1726853282.67713: checking for any_errors_fatal 11762 1726853282.67721: done checking for any_errors_fatal 11762 1726853282.67722: checking for max_fail_percentage 11762 1726853282.67725: done checking for max_fail_percentage 11762 1726853282.67726: checking to see if all hosts have failed and the running result is not ok 11762 1726853282.67726: done checking to see if all hosts have failed 11762 1726853282.67727: getting the remaining hosts for this loop 11762 1726853282.67729: done getting the remaining hosts for this loop 11762 1726853282.67733: getting the next task for host managed_node2 11762 1726853282.67741: done getting next task for host managed_node2 11762 1726853282.67754: ^ task is: TASK: Cleanup 11762 1726853282.67758: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853282.67764: getting variables 11762 1726853282.67765: in VariableManager get_vars() 11762 1726853282.67802: Calling all_inventory to load vars for managed_node2 11762 1726853282.67805: Calling groups_inventory to load vars for managed_node2 11762 1726853282.67809: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853282.67822: Calling all_plugins_play to load vars for managed_node2 11762 1726853282.67825: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853282.67828: Calling groups_plugins_play to load vars for managed_node2 11762 1726853282.68369: done sending task result for task 02083763-bbaf-d845-03d0-00000000008f 11762 1726853282.68374: WORKER PROCESS EXITING 11762 1726853282.81254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853282.85136: done with get_vars() 11762 1726853282.85175: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 13:28:02 -0400 (0:00:00.205) 0:00:33.284 ****** 11762 1726853282.85388: entering _queue_task() for managed_node2/include_tasks 11762 1726853282.86237: worker is 1 (out of 1 available) 11762 1726853282.86252: exiting _queue_task() for managed_node2/include_tasks 11762 1726853282.86266: done queuing things up, now waiting for results queue to drain 11762 1726853282.86269: waiting for pending results... 
11762 1726853282.87066: running TaskExecutor() for managed_node2/TASK: Cleanup 11762 1726853282.87189: in run() - task 02083763-bbaf-d845-03d0-000000000093 11762 1726853282.87480: variable 'ansible_search_path' from source: unknown 11762 1726853282.87484: variable 'ansible_search_path' from source: unknown 11762 1726853282.87537: variable 'lsr_cleanup' from source: include params 11762 1726853282.88250: variable 'lsr_cleanup' from source: include params 11762 1726853282.88329: variable 'omit' from source: magic vars 11762 1726853282.88875: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853282.88928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853282.88933: variable 'omit' from source: magic vars 11762 1726853282.89591: variable 'ansible_distribution_major_version' from source: facts 11762 1726853282.89595: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853282.89599: variable 'item' from source: unknown 11762 1726853282.89602: variable 'item' from source: unknown 11762 1726853282.89605: variable 'item' from source: unknown 11762 1726853282.89883: variable 'item' from source: unknown 11762 1726853282.90200: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853282.90204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853282.90206: variable 'omit' from source: magic vars 11762 1726853282.90208: variable 'ansible_distribution_major_version' from source: facts 11762 1726853282.90551: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853282.90555: variable 'item' from source: unknown 11762 1726853282.90557: variable 'item' from source: unknown 11762 1726853282.90560: variable 'item' from source: unknown 11762 1726853282.90740: variable 'item' from source: unknown 11762 1726853282.90981: dumping result to json 11762 1726853282.90984: done dumping result, returning 11762 
1726853282.90986: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-d845-03d0-000000000093] 11762 1726853282.90989: sending task result for task 02083763-bbaf-d845-03d0-000000000093 11762 1726853282.91031: done sending task result for task 02083763-bbaf-d845-03d0-000000000093 11762 1726853282.91034: WORKER PROCESS EXITING 11762 1726853282.91108: no more pending results, returning what we have 11762 1726853282.91113: in VariableManager get_vars() 11762 1726853282.91164: Calling all_inventory to load vars for managed_node2 11762 1726853282.91168: Calling groups_inventory to load vars for managed_node2 11762 1726853282.91173: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853282.91187: Calling all_plugins_play to load vars for managed_node2 11762 1726853282.91192: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853282.91195: Calling groups_plugins_play to load vars for managed_node2 11762 1726853282.92891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853282.95418: done with get_vars() 11762 1726853282.95456: variable 'ansible_search_path' from source: unknown 11762 1726853282.95457: variable 'ansible_search_path' from source: unknown 11762 1726853282.95505: variable 'ansible_search_path' from source: unknown 11762 1726853282.95506: variable 'ansible_search_path' from source: unknown 11762 1726853282.95537: we have included files to process 11762 1726853282.95538: generating all_blocks data 11762 1726853282.95541: done generating all_blocks data 11762 1726853282.95548: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11762 1726853282.95550: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11762 1726853282.95553: Loading 
data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11762 1726853282.95808: in VariableManager get_vars() 11762 1726853282.95829: done with get_vars() 11762 1726853282.95835: variable 'omit' from source: magic vars 11762 1726853282.95883: variable 'omit' from source: magic vars 11762 1726853282.95948: in VariableManager get_vars() 11762 1726853282.95961: done with get_vars() 11762 1726853282.95989: in VariableManager get_vars() 11762 1726853282.96005: done with get_vars() 11762 1726853282.96048: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11762 1726853282.96227: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11762 1726853282.96307: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11762 1726853282.96689: in VariableManager get_vars() 11762 1726853282.96706: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11762 1726853282.98668: done processing included file 11762 1726853282.98672: iterating over new_blocks loaded from include file 11762 1726853282.98674: in VariableManager get_vars() 11762 1726853282.98765: done with get_vars() 11762 1726853282.98767: filtering new block on tags 11762 1726853282.99107: done filtering new block on tags 11762 1726853282.99111: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node2 => (item=tasks/cleanup_bond_profile+device.yml) 11762 1726853282.99117: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11762 1726853282.99118: loading included 
file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11762 1726853282.99121: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11762 1726853282.99511: done processing included file 11762 1726853282.99513: iterating over new_blocks loaded from include file 11762 1726853282.99514: in VariableManager get_vars() 11762 1726853282.99531: done with get_vars() 11762 1726853282.99533: filtering new block on tags 11762 1726853282.99566: done filtering new block on tags 11762 1726853282.99569: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 11762 1726853282.99575: extending task lists for all hosts with included blocks 11762 1726853283.02118: done extending task lists 11762 1726853283.02120: done processing included files 11762 1726853283.02121: results queue empty 11762 1726853283.02121: checking for any_errors_fatal 11762 1726853283.02126: done checking for any_errors_fatal 11762 1726853283.02126: checking for max_fail_percentage 11762 1726853283.02127: done checking for max_fail_percentage 11762 1726853283.02128: checking to see if all hosts have failed and the running result is not ok 11762 1726853283.02129: done checking to see if all hosts have failed 11762 1726853283.02130: getting the remaining hosts for this loop 11762 1726853283.02131: done getting the remaining hosts for this loop 11762 1726853283.02133: getting the next task for host managed_node2 11762 1726853283.02138: done getting next task for host managed_node2 11762 1726853283.02148: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11762 1726853283.02151: ^ state is: 
HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853283.02160: getting variables 11762 1726853283.02161: in VariableManager get_vars() 11762 1726853283.02384: Calling all_inventory to load vars for managed_node2 11762 1726853283.02387: Calling groups_inventory to load vars for managed_node2 11762 1726853283.02389: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853283.02396: Calling all_plugins_play to load vars for managed_node2 11762 1726853283.02398: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853283.02401: Calling groups_plugins_play to load vars for managed_node2 11762 1726853283.04045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853283.05588: done with get_vars() 11762 1726853283.05612: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:28:03 -0400 (0:00:00.202) 0:00:33.487 ****** 11762 1726853283.05689: entering _queue_task() for managed_node2/include_tasks 11762 1726853283.06055: worker is 1 (out of 1 available) 11762 1726853283.06070: exiting _queue_task() for managed_node2/include_tasks 11762 1726853283.06302: done queuing things up, now waiting for results queue to drain 11762 1726853283.06304: waiting for pending results... 
11762 1726853283.06577: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11762 1726853283.06618: in run() - task 02083763-bbaf-d845-03d0-000000000693 11762 1726853283.06676: variable 'ansible_search_path' from source: unknown 11762 1726853283.06684: variable 'ansible_search_path' from source: unknown 11762 1726853283.06731: calling self._execute() 11762 1726853283.06840: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853283.06877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853283.06880: variable 'omit' from source: magic vars 11762 1726853283.07259: variable 'ansible_distribution_major_version' from source: facts 11762 1726853283.07278: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853283.07351: _execute() done 11762 1726853283.07354: dumping result to json 11762 1726853283.07356: done dumping result, returning 11762 1726853283.07359: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-d845-03d0-000000000693] 11762 1726853283.07361: sending task result for task 02083763-bbaf-d845-03d0-000000000693 11762 1726853283.07440: done sending task result for task 02083763-bbaf-d845-03d0-000000000693 11762 1726853283.07498: no more pending results, returning what we have 11762 1726853283.07504: in VariableManager get_vars() 11762 1726853283.07553: Calling all_inventory to load vars for managed_node2 11762 1726853283.07557: Calling groups_inventory to load vars for managed_node2 11762 1726853283.07560: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853283.07574: Calling all_plugins_play to load vars for managed_node2 11762 1726853283.07577: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853283.07580: Calling groups_plugins_play to load vars for managed_node2 11762 
1726853283.08284: WORKER PROCESS EXITING 11762 1726853283.09359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853283.11589: done with get_vars() 11762 1726853283.11619: variable 'ansible_search_path' from source: unknown 11762 1726853283.11620: variable 'ansible_search_path' from source: unknown 11762 1726853283.11782: we have included files to process 11762 1726853283.11784: generating all_blocks data 11762 1726853283.11786: done generating all_blocks data 11762 1726853283.11787: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853283.11788: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853283.11791: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853283.12529: done processing included file 11762 1726853283.12531: iterating over new_blocks loaded from include file 11762 1726853283.12533: in VariableManager get_vars() 11762 1726853283.12561: done with get_vars() 11762 1726853283.12564: filtering new block on tags 11762 1726853283.12595: done filtering new block on tags 11762 1726853283.12598: in VariableManager get_vars() 11762 1726853283.12622: done with get_vars() 11762 1726853283.12623: filtering new block on tags 11762 1726853283.12673: done filtering new block on tags 11762 1726853283.12676: in VariableManager get_vars() 11762 1726853283.12699: done with get_vars() 11762 1726853283.12701: filtering new block on tags 11762 1726853283.12744: done filtering new block on tags 11762 1726853283.12747: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11762 1726853283.12751: extending task lists for all hosts 
with included blocks 11762 1726853283.14357: done extending task lists 11762 1726853283.14359: done processing included files 11762 1726853283.14360: results queue empty 11762 1726853283.14361: checking for any_errors_fatal 11762 1726853283.14365: done checking for any_errors_fatal 11762 1726853283.14366: checking for max_fail_percentage 11762 1726853283.14367: done checking for max_fail_percentage 11762 1726853283.14368: checking to see if all hosts have failed and the running result is not ok 11762 1726853283.14369: done checking to see if all hosts have failed 11762 1726853283.14369: getting the remaining hosts for this loop 11762 1726853283.14390: done getting the remaining hosts for this loop 11762 1726853283.14393: getting the next task for host managed_node2 11762 1726853283.14399: done getting next task for host managed_node2 11762 1726853283.14402: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11762 1726853283.14407: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853283.14417: getting variables 11762 1726853283.14418: in VariableManager get_vars() 11762 1726853283.14436: Calling all_inventory to load vars for managed_node2 11762 1726853283.14439: Calling groups_inventory to load vars for managed_node2 11762 1726853283.14441: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853283.14450: Calling all_plugins_play to load vars for managed_node2 11762 1726853283.14453: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853283.14456: Calling groups_plugins_play to load vars for managed_node2 11762 1726853283.16383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853283.18014: done with get_vars() 11762 1726853283.18037: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:28:03 -0400 (0:00:00.124) 0:00:33.611 ****** 11762 1726853283.18128: entering _queue_task() for managed_node2/setup 11762 1726853283.18518: worker is 1 (out of 1 available) 11762 1726853283.18531: exiting _queue_task() for managed_node2/setup 11762 1726853283.18549: done queuing things up, now waiting for results queue to drain 11762 1726853283.18552: waiting for pending results... 
11762 1726853283.18839: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11762 1726853283.18995: in run() - task 02083763-bbaf-d845-03d0-0000000007c9 11762 1726853283.19011: variable 'ansible_search_path' from source: unknown 11762 1726853283.19015: variable 'ansible_search_path' from source: unknown 11762 1726853283.19052: calling self._execute() 11762 1726853283.19145: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853283.19150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853283.19158: variable 'omit' from source: magic vars 11762 1726853283.19534: variable 'ansible_distribution_major_version' from source: facts 11762 1726853283.19546: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853283.19755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853283.22412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853283.22452: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853283.22492: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853283.22539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853283.22622: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853283.22683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853283.22733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853283.22759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853283.22805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853283.22830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853283.22873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853283.22956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853283.22960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853283.22990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853283.23048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853283.23217: variable '__network_required_facts' from source: role 
'' defaults 11762 1726853283.23241: variable 'ansible_facts' from source: unknown 11762 1726853283.24129: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11762 1726853283.24133: when evaluation is False, skipping this task 11762 1726853283.24138: _execute() done 11762 1726853283.24143: dumping result to json 11762 1726853283.24146: done dumping result, returning 11762 1726853283.24177: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-d845-03d0-0000000007c9] 11762 1726853283.24180: sending task result for task 02083763-bbaf-d845-03d0-0000000007c9 11762 1726853283.24481: done sending task result for task 02083763-bbaf-d845-03d0-0000000007c9 11762 1726853283.24485: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853283.24530: no more pending results, returning what we have 11762 1726853283.24533: results queue empty 11762 1726853283.24534: checking for any_errors_fatal 11762 1726853283.24535: done checking for any_errors_fatal 11762 1726853283.24536: checking for max_fail_percentage 11762 1726853283.24538: done checking for max_fail_percentage 11762 1726853283.24539: checking to see if all hosts have failed and the running result is not ok 11762 1726853283.24539: done checking to see if all hosts have failed 11762 1726853283.24540: getting the remaining hosts for this loop 11762 1726853283.24541: done getting the remaining hosts for this loop 11762 1726853283.24544: getting the next task for host managed_node2 11762 1726853283.24553: done getting next task for host managed_node2 11762 1726853283.24557: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11762 1726853283.24563: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853283.24580: getting variables 11762 1726853283.24582: in VariableManager get_vars() 11762 1726853283.24615: Calling all_inventory to load vars for managed_node2 11762 1726853283.24618: Calling groups_inventory to load vars for managed_node2 11762 1726853283.24621: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853283.24629: Calling all_plugins_play to load vars for managed_node2 11762 1726853283.24632: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853283.24644: Calling groups_plugins_play to load vars for managed_node2 11762 1726853283.26364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853283.28202: done with get_vars() 11762 1726853283.28230: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:28:03 -0400 (0:00:00.102) 0:00:33.713 ****** 11762 1726853283.28349: entering _queue_task() for managed_node2/stat 11762 1726853283.28719: worker is 1 (out of 1 available) 11762 1726853283.28975: exiting _queue_task() for managed_node2/stat 11762 1726853283.28988: done queuing things up, now waiting for results queue to drain 11762 1726853283.28990: waiting for pending results... 
11762 1726853283.29087: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11762 1726853283.29330: in run() - task 02083763-bbaf-d845-03d0-0000000007cb 11762 1726853283.29335: variable 'ansible_search_path' from source: unknown 11762 1726853283.29338: variable 'ansible_search_path' from source: unknown 11762 1726853283.29346: calling self._execute() 11762 1726853283.29454: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853283.29466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853283.29485: variable 'omit' from source: magic vars 11762 1726853283.29891: variable 'ansible_distribution_major_version' from source: facts 11762 1726853283.29909: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853283.30200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853283.30389: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853283.30451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853283.30491: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853283.30541: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853283.30636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853283.30670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853283.30703: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853283.30741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853283.30979: variable '__network_is_ostree' from source: set_fact 11762 1726853283.31089: Evaluated conditional (not __network_is_ostree is defined): False 11762 1726853283.31092: when evaluation is False, skipping this task 11762 1726853283.31139: _execute() done 11762 1726853283.31143: dumping result to json 11762 1726853283.31202: done dumping result, returning 11762 1726853283.31205: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-d845-03d0-0000000007cb] 11762 1726853283.31208: sending task result for task 02083763-bbaf-d845-03d0-0000000007cb skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11762 1726853283.31626: no more pending results, returning what we have 11762 1726853283.31631: results queue empty 11762 1726853283.31632: checking for any_errors_fatal 11762 1726853283.31642: done checking for any_errors_fatal 11762 1726853283.31642: checking for max_fail_percentage 11762 1726853283.31645: done checking for max_fail_percentage 11762 1726853283.31646: checking to see if all hosts have failed and the running result is not ok 11762 1726853283.31646: done checking to see if all hosts have failed 11762 1726853283.31647: getting the remaining hosts for this loop 11762 1726853283.31649: done getting the remaining hosts for this loop 11762 1726853283.31653: getting the next task for host managed_node2 11762 1726853283.31662: done getting next task for host managed_node2 11762 
1726853283.31673: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11762 1726853283.31681: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853283.31698: getting variables 11762 1726853283.31700: in VariableManager get_vars() 11762 1726853283.31751: Calling all_inventory to load vars for managed_node2 11762 1726853283.31754: Calling groups_inventory to load vars for managed_node2 11762 1726853283.31758: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853283.32052: Calling all_plugins_play to load vars for managed_node2 11762 1726853283.32057: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853283.32061: Calling groups_plugins_play to load vars for managed_node2 11762 1726853283.32123: done sending task result for task 02083763-bbaf-d845-03d0-0000000007cb 11762 1726853283.32128: WORKER PROCESS EXITING 11762 1726853283.35340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853283.37166: done with get_vars() 11762 1726853283.37200: done getting variables 11762 1726853283.37273: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:28:03 -0400 (0:00:00.089) 0:00:33.803 ****** 11762 1726853283.37314: entering _queue_task() for managed_node2/set_fact 11762 1726853283.37959: worker is 1 (out of 1 available) 11762 1726853283.37974: exiting _queue_task() for managed_node2/set_fact 11762 1726853283.38128: done queuing things up, now waiting for results queue to drain 11762 1726853283.38130: waiting for pending results... 
11762 1726853283.38334: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11762 1726853283.38500: in run() - task 02083763-bbaf-d845-03d0-0000000007cc 11762 1726853283.38523: variable 'ansible_search_path' from source: unknown 11762 1726853283.38560: variable 'ansible_search_path' from source: unknown 11762 1726853283.38588: calling self._execute() 11762 1726853283.38787: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853283.38863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853283.38875: variable 'omit' from source: magic vars 11762 1726853283.39139: variable 'ansible_distribution_major_version' from source: facts 11762 1726853283.39149: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853283.39264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853283.39489: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853283.39522: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853283.39548: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853283.39576: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853283.39645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853283.39661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853283.39694: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853283.39709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853283.39774: variable '__network_is_ostree' from source: set_fact 11762 1726853283.39780: Evaluated conditional (not __network_is_ostree is defined): False 11762 1726853283.39783: when evaluation is False, skipping this task 11762 1726853283.39786: _execute() done 11762 1726853283.39788: dumping result to json 11762 1726853283.39792: done dumping result, returning 11762 1726853283.39800: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-d845-03d0-0000000007cc] 11762 1726853283.39803: sending task result for task 02083763-bbaf-d845-03d0-0000000007cc 11762 1726853283.39894: done sending task result for task 02083763-bbaf-d845-03d0-0000000007cc 11762 1726853283.39897: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11762 1726853283.39946: no more pending results, returning what we have 11762 1726853283.39950: results queue empty 11762 1726853283.39951: checking for any_errors_fatal 11762 1726853283.39956: done checking for any_errors_fatal 11762 1726853283.39956: checking for max_fail_percentage 11762 1726853283.39958: done checking for max_fail_percentage 11762 1726853283.39959: checking to see if all hosts have failed and the running result is not ok 11762 1726853283.39960: done checking to see if all hosts have failed 11762 1726853283.39960: getting the remaining hosts for this loop 11762 1726853283.39963: done getting the remaining hosts for this loop 
11762 1726853283.39966: getting the next task for host managed_node2 11762 1726853283.39979: done getting next task for host managed_node2 11762 1726853283.39982: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11762 1726853283.39989: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853283.40005: getting variables 11762 1726853283.40012: in VariableManager get_vars() 11762 1726853283.40049: Calling all_inventory to load vars for managed_node2 11762 1726853283.40052: Calling groups_inventory to load vars for managed_node2 11762 1726853283.40054: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853283.40063: Calling all_plugins_play to load vars for managed_node2 11762 1726853283.40065: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853283.40068: Calling groups_plugins_play to load vars for managed_node2 11762 1726853283.41201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853283.42372: done with get_vars() 11762 1726853283.42404: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:28:03 -0400 (0:00:00.053) 0:00:33.856 ****** 11762 1726853283.42628: entering _queue_task() for managed_node2/service_facts 11762 1726853283.43182: worker is 1 (out of 1 available) 11762 1726853283.43198: exiting _queue_task() for managed_node2/service_facts 11762 1726853283.43213: done queuing things up, now waiting for results queue to drain 11762 1726853283.43215: waiting for pending results... 
11762 1726853283.43634: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11762 1726853283.43779: in run() - task 02083763-bbaf-d845-03d0-0000000007ce 11762 1726853283.43783: variable 'ansible_search_path' from source: unknown 11762 1726853283.43787: variable 'ansible_search_path' from source: unknown 11762 1726853283.43816: calling self._execute() 11762 1726853283.43980: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853283.43985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853283.43990: variable 'omit' from source: magic vars 11762 1726853283.44516: variable 'ansible_distribution_major_version' from source: facts 11762 1726853283.44524: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853283.44531: variable 'omit' from source: magic vars 11762 1726853283.44696: variable 'omit' from source: magic vars 11762 1726853283.44711: variable 'omit' from source: magic vars 11762 1726853283.44730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853283.44790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853283.44803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853283.44821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853283.44834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853283.44880: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853283.45000: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853283.45006: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853283.45010: Set connection var ansible_timeout to 10 11762 1726853283.45021: Set connection var ansible_shell_type to sh 11762 1726853283.45024: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853283.45028: Set connection var ansible_shell_executable to /bin/sh 11762 1726853283.45082: Set connection var ansible_pipelining to False 11762 1726853283.45088: Set connection var ansible_connection to ssh 11762 1726853283.45095: variable 'ansible_shell_executable' from source: unknown 11762 1726853283.45099: variable 'ansible_connection' from source: unknown 11762 1726853283.45103: variable 'ansible_module_compression' from source: unknown 11762 1726853283.45106: variable 'ansible_shell_type' from source: unknown 11762 1726853283.45147: variable 'ansible_shell_executable' from source: unknown 11762 1726853283.45151: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853283.45153: variable 'ansible_pipelining' from source: unknown 11762 1726853283.45155: variable 'ansible_timeout' from source: unknown 11762 1726853283.45157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853283.45378: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853283.45424: variable 'omit' from source: magic vars 11762 1726853283.45428: starting attempt loop 11762 1726853283.45430: running the handler 11762 1726853283.45432: _low_level_execute_command(): starting 11762 1726853283.45435: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853283.46246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853283.46365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853283.46423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853283.46455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853283.48315: stdout chunk (state=3): >>>/root <<< 11762 1726853283.48422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853283.48425: stdout chunk (state=3): >>><<< 11762 1726853283.48428: stderr chunk (state=3): >>><<< 11762 1726853283.48450: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853283.48475: _low_level_execute_command(): starting 11762 1726853283.48560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572 `" && echo ansible-tmp-1726853283.4845695-13404-215235283750572="` echo /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572 `" ) && sleep 0' 11762 1726853283.49688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853283.49808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853283.49824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853283.49890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853283.50138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853283.52257: stdout chunk (state=3): >>>ansible-tmp-1726853283.4845695-13404-215235283750572=/root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572 <<< 11762 1726853283.52376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853283.52462: stderr chunk (state=3): >>><<< 11762 1726853283.52466: stdout chunk (state=3): >>><<< 11762 1726853283.52688: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853283.4845695-13404-215235283750572=/root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853283.52692: variable 'ansible_module_compression' from source: unknown 11762 1726853283.52694: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11762 1726853283.52696: variable 'ansible_facts' from source: unknown 11762 1726853283.52776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/AnsiballZ_service_facts.py 11762 1726853283.52943: Sending initial data 11762 1726853283.53000: Sent initial data (162 bytes) 11762 1726853283.53976: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853283.54030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853283.54091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853283.54160: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853283.54390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853283.54429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853283.54663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853283.56363: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853283.56406: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853283.56482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853283.56579: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpkahy39xh /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/AnsiballZ_service_facts.py <<< 11762 1726853283.56582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/AnsiballZ_service_facts.py" <<< 11762 1726853283.56696: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpkahy39xh" to remote "/root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/AnsiballZ_service_facts.py" <<< 11762 1726853283.57788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853283.57919: stdout chunk (state=3): >>><<< 11762 1726853283.57922: stderr chunk (state=3): >>><<< 11762 1726853283.57931: done transferring module to remote 11762 1726853283.57950: _low_level_execute_command(): starting 11762 1726853283.57961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/ /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/AnsiballZ_service_facts.py && sleep 0' 11762 1726853283.59061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853283.59129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853283.59147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853283.59160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853283.59239: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853283.59258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853283.59277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853283.59309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853283.59416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853283.61567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853283.61574: stderr chunk (state=3): >>><<< 11762 1726853283.61577: stdout chunk (state=3): >>><<< 11762 1726853283.61720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853283.61724: _low_level_execute_command(): starting 11762 1726853283.61727: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/AnsiballZ_service_facts.py && sleep 0' 11762 1726853283.62980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853283.63006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853283.63201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853283.63302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' 
<<< 11762 1726853283.63306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853283.63329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853283.63485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853285.30048: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11762 1726853285.30082: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 11762 1726853285.30103: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 11762 1726853285.30135: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 11762 1726853285.30182: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11762 1726853285.31953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853285.31957: stdout chunk (state=3): >>><<< 11762 1726853285.31960: stderr chunk (state=3): >>><<< 11762 1726853285.32196: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": 
{"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": 
"wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", 
"state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": 
"systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853285.33877: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853285.33895: _low_level_execute_command(): starting 11762 1726853285.33905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853283.4845695-13404-215235283750572/ > /dev/null 2>&1 && sleep 0' 11762 1726853285.35107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853285.35198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853285.35383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853285.35460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853285.35688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853285.38007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853285.38011: stdout chunk (state=3): >>><<< 11762 1726853285.38013: stderr chunk (state=3): >>><<< 11762 1726853285.38016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 11762 1726853285.38018: handler run complete 11762 1726853285.38546: variable 'ansible_facts' from source: unknown 11762 1726853285.38917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853285.40559: variable 'ansible_facts' from source: unknown 11762 1726853285.41156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853285.41696: attempt loop complete, returning result 11762 1726853285.41779: _execute() done 11762 1726853285.41885: dumping result to json 11762 1726853285.42127: done dumping result, returning 11762 1726853285.42145: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-d845-03d0-0000000007ce] 11762 1726853285.42164: sending task result for task 02083763-bbaf-d845-03d0-0000000007ce ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853285.45091: no more pending results, returning what we have 11762 1726853285.45094: results queue empty 11762 1726853285.45094: checking for any_errors_fatal 11762 1726853285.45098: done checking for any_errors_fatal 11762 1726853285.45098: checking for max_fail_percentage 11762 1726853285.45100: done checking for max_fail_percentage 11762 1726853285.45101: checking to see if all hosts have failed and the running result is not ok 11762 1726853285.45101: done checking to see if all hosts have failed 11762 1726853285.45102: getting the remaining hosts for this loop 11762 1726853285.45103: done getting the remaining hosts for this loop 11762 1726853285.45106: getting the next task for host managed_node2 11762 1726853285.45112: done getting next task for host managed_node2 11762 1726853285.45116: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 11762 1726853285.45122: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853285.45131: getting variables 11762 1726853285.45132: in VariableManager get_vars() 11762 1726853285.45166: Calling all_inventory to load vars for managed_node2 11762 1726853285.45170: Calling groups_inventory to load vars for managed_node2 11762 1726853285.45201: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853285.45208: done sending task result for task 02083763-bbaf-d845-03d0-0000000007ce 11762 1726853285.45210: WORKER PROCESS EXITING 11762 1726853285.45219: Calling all_plugins_play to load vars for managed_node2 11762 1726853285.45221: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853285.45224: Calling groups_plugins_play to load vars for managed_node2 11762 1726853285.47127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853285.51566: done with get_vars() 11762 1726853285.51601: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:05 -0400 (0:00:02.092) 0:00:35.949 ****** 11762 1726853285.51895: entering _queue_task() for managed_node2/package_facts 11762 1726853285.52694: worker is 1 (out of 1 available) 11762 1726853285.52707: exiting _queue_task() for managed_node2/package_facts 11762 1726853285.52787: done queuing things up, now waiting for results queue to drain 11762 1726853285.52790: waiting for pending results... 
11762 1726853285.53505: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11762 1726853285.54039: in run() - task 02083763-bbaf-d845-03d0-0000000007cf 11762 1726853285.54103: variable 'ansible_search_path' from source: unknown 11762 1726853285.54107: variable 'ansible_search_path' from source: unknown 11762 1726853285.54147: calling self._execute() 11762 1726853285.54582: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853285.54676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853285.54690: variable 'omit' from source: magic vars 11762 1726853285.55530: variable 'ansible_distribution_major_version' from source: facts 11762 1726853285.55541: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853285.55547: variable 'omit' from source: magic vars 11762 1726853285.55761: variable 'omit' from source: magic vars 11762 1726853285.55796: variable 'omit' from source: magic vars 11762 1726853285.55981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853285.56061: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853285.56086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853285.56100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853285.56112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853285.56309: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853285.56312: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853285.56315: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853285.56486: Set connection var ansible_timeout to 10 11762 1726853285.56489: Set connection var ansible_shell_type to sh 11762 1726853285.56494: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853285.56512: Set connection var ansible_shell_executable to /bin/sh 11762 1726853285.56533: Set connection var ansible_pipelining to False 11762 1726853285.56576: Set connection var ansible_connection to ssh 11762 1726853285.56580: variable 'ansible_shell_executable' from source: unknown 11762 1726853285.56582: variable 'ansible_connection' from source: unknown 11762 1726853285.56584: variable 'ansible_module_compression' from source: unknown 11762 1726853285.56586: variable 'ansible_shell_type' from source: unknown 11762 1726853285.56587: variable 'ansible_shell_executable' from source: unknown 11762 1726853285.56589: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853285.56591: variable 'ansible_pipelining' from source: unknown 11762 1726853285.56593: variable 'ansible_timeout' from source: unknown 11762 1726853285.56594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853285.56948: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853285.56957: variable 'omit' from source: magic vars 11762 1726853285.56960: starting attempt loop 11762 1726853285.56963: running the handler 11762 1726853285.56965: _low_level_execute_command(): starting 11762 1726853285.56967: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853285.57667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853285.57681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853285.57750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853285.57758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853285.57799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853285.57884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853285.59583: stdout chunk (state=3): >>>/root <<< 11762 1726853285.59740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853285.59781: stderr chunk (state=3): >>><<< 11762 1726853285.59784: stdout chunk (state=3): >>><<< 11762 1726853285.59802: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853285.59830: _low_level_execute_command(): starting 11762 1726853285.59833: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419 `" && echo ansible-tmp-1726853285.598018-13483-197209784803419="` echo /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419 `" ) && sleep 0' 11762 1726853285.60487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853285.60492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853285.60536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853285.60552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853285.60578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853285.60682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853285.62694: stdout chunk (state=3): >>>ansible-tmp-1726853285.598018-13483-197209784803419=/root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419 <<< 11762 1726853285.62819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853285.62842: stderr chunk (state=3): >>><<< 11762 1726853285.62845: stdout chunk (state=3): >>><<< 11762 1726853285.62862: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853285.598018-13483-197209784803419=/root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853285.62909: variable 'ansible_module_compression' from source: unknown 11762 1726853285.62947: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11762 1726853285.63006: variable 'ansible_facts' from source: unknown 11762 1726853285.63124: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/AnsiballZ_package_facts.py 11762 1726853285.63243: Sending initial data 11762 1726853285.63247: Sent initial data (161 bytes) 11762 1726853285.63815: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853285.63884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853285.63959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853285.65691: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853285.65755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853285.65826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp4a5shna6 /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/AnsiballZ_package_facts.py <<< 11762 1726853285.65836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/AnsiballZ_package_facts.py" <<< 11762 1726853285.65895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp4a5shna6" to remote "/root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/AnsiballZ_package_facts.py" <<< 11762 1726853285.65898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/AnsiballZ_package_facts.py" <<< 11762 1726853285.67188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853285.67239: stderr chunk (state=3): >>><<< 11762 1726853285.67367: stdout chunk (state=3): >>><<< 11762 1726853285.67372: done transferring module to remote 11762 1726853285.67375: _low_level_execute_command(): starting 11762 1726853285.67377: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/ /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/AnsiballZ_package_facts.py && sleep 0' 11762 1726853285.68049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853285.68074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853285.68108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853285.68166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853285.68312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853285.70264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853285.70270: stdout chunk (state=3): >>><<< 11762 1726853285.70275: stderr chunk (state=3): >>><<< 11762 1726853285.70435: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853285.70447: _low_level_execute_command(): starting 11762 1726853285.70450: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/AnsiballZ_package_facts.py && sleep 0' 11762 1726853285.71381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853285.71385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853285.71388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853285.71390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853285.71460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853285.71504: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853285.71597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853286.18004: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": 
[{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": 
"libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": 
"1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", 
"version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": 
"0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", 
"release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", 
"version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", 
"source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": 
"perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": 
"2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", 
"release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 11762 
1726853286.18030: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11762 1726853286.20179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853286.20183: stdout chunk (state=3): >>><<< 11762 1726853286.20185: stderr chunk (state=3): >>><<< 11762 1726853286.20251: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
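The segment above is the JSON result of a remote `package_facts` run returning over the multiplexed SSH connection. A minimal sketch of a task that produces this kind of output — the task name matches the one the log later attributes to the role, and the module arguments (`manager`, `strategy`) match the `invocation.module_args` recorded at the end of the JSON; the `no_log` setting is an assumption, inferred from the censored task result that appears further down:

```yaml
# Hedged sketch of the package-inventory task traced in this log.
# manager/strategy mirror the recorded module_args; no_log is inferred
# from the "output has been hidden" result later in the log.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true
```

With `no_log: true`, the controller still collects `ansible_facts.packages` for later use, but censors the task result in playbook output, which is consistent with the `"censored"` message shown below.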
11762 1726853286.22516: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853286.22531: _low_level_execute_command(): starting 11762 1726853286.22534: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853285.598018-13483-197209784803419/ > /dev/null 2>&1 && sleep 0' 11762 1726853286.23284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853286.23288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853286.23291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853286.23293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853286.23295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853286.23298: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853286.23318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853286.23320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853286.23322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
<<< 11762 1726853286.23324: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853286.23326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853286.23327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853286.23330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853286.23332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853286.23333: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853286.23335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853286.23432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853286.23436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853286.23463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853286.23554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853286.25830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853286.25834: stdout chunk (state=3): >>><<< 11762 1726853286.25836: stderr chunk (state=3): >>><<< 11762 1726853286.25854: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853286.25860: handler run complete 11762 1726853286.26903: variable 'ansible_facts' from source: unknown 11762 1726853286.27406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.30640: variable 'ansible_facts' from source: unknown 11762 1726853286.31121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.32272: attempt loop complete, returning result 11762 1726853286.32277: _execute() done 11762 1726853286.32280: dumping result to json 11762 1726853286.32447: done dumping result, returning 11762 1726853286.32496: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-d845-03d0-0000000007cf] 11762 1726853286.32508: sending task result for task 02083763-bbaf-d845-03d0-0000000007cf ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853286.38157: no more pending results, returning what we have 11762 1726853286.38160: results queue empty 11762 1726853286.38161: checking for any_errors_fatal 11762 1726853286.38165: done checking for any_errors_fatal 11762 
1726853286.38166: checking for max_fail_percentage 11762 1726853286.38167: done checking for max_fail_percentage 11762 1726853286.38168: checking to see if all hosts have failed and the running result is not ok 11762 1726853286.38169: done checking to see if all hosts have failed 11762 1726853286.38169: getting the remaining hosts for this loop 11762 1726853286.38174: done getting the remaining hosts for this loop 11762 1726853286.38177: getting the next task for host managed_node2 11762 1726853286.38183: done getting next task for host managed_node2 11762 1726853286.38186: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853286.38191: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853286.38201: getting variables 11762 1726853286.38202: in VariableManager get_vars() 11762 1726853286.38235: Calling all_inventory to load vars for managed_node2 11762 1726853286.38238: Calling groups_inventory to load vars for managed_node2 11762 1726853286.38241: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853286.38262: Calling all_plugins_play to load vars for managed_node2 11762 1726853286.38265: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853286.38268: Calling groups_plugins_play to load vars for managed_node2 11762 1726853286.38616: done sending task result for task 02083763-bbaf-d845-03d0-0000000007cf 11762 1726853286.38620: WORKER PROCESS EXITING 11762 1726853286.41540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.43647: done with get_vars() 11762 1726853286.43682: done getting variables 11762 1726853286.43758: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:06 -0400 (0:00:00.919) 0:00:36.868 ****** 11762 1726853286.43799: entering _queue_task() for managed_node2/debug 11762 1726853286.44484: worker is 1 (out of 1 available) 11762 1726853286.44498: exiting _queue_task() for managed_node2/debug 11762 1726853286.44509: done queuing things up, now waiting for results queue to drain 11762 1726853286.44512: waiting for pending results... 
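The trace now enters the "Print network provider" task (task path `roles/network/tasks/main.yml:7`). A plausible sketch of that task, reconstructed from the log: the variable name `network_provider` comes from the `set_fact` source the trace records, and the message format matches the eventual result (`Using network provider: nm`); the exact task body is an assumption, not taken from the role source:

```yaml
# Hedged sketch of the debug step whose execution is traced below;
# network_provider is the set_fact variable named in the log.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```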
11762 1726853286.45394: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853286.45457: in run() - task 02083763-bbaf-d845-03d0-000000000694 11762 1726853286.45488: variable 'ansible_search_path' from source: unknown 11762 1726853286.45497: variable 'ansible_search_path' from source: unknown 11762 1726853286.45543: calling self._execute() 11762 1726853286.45640: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853286.45654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853286.45668: variable 'omit' from source: magic vars 11762 1726853286.46060: variable 'ansible_distribution_major_version' from source: facts 11762 1726853286.46079: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853286.46091: variable 'omit' from source: magic vars 11762 1726853286.46170: variable 'omit' from source: magic vars 11762 1726853286.46273: variable 'network_provider' from source: set_fact 11762 1726853286.46299: variable 'omit' from source: magic vars 11762 1726853286.46346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853286.46387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853286.46412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853286.46435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853286.46456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853286.46495: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853286.46504: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 
1726853286.46513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853286.46677: Set connection var ansible_timeout to 10 11762 1726853286.46680: Set connection var ansible_shell_type to sh 11762 1726853286.46683: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853286.46688: Set connection var ansible_shell_executable to /bin/sh 11762 1726853286.46691: Set connection var ansible_pipelining to False 11762 1726853286.46693: Set connection var ansible_connection to ssh 11762 1726853286.46705: variable 'ansible_shell_executable' from source: unknown 11762 1726853286.46713: variable 'ansible_connection' from source: unknown 11762 1726853286.46721: variable 'ansible_module_compression' from source: unknown 11762 1726853286.46728: variable 'ansible_shell_type' from source: unknown 11762 1726853286.46738: variable 'ansible_shell_executable' from source: unknown 11762 1726853286.46752: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853286.46761: variable 'ansible_pipelining' from source: unknown 11762 1726853286.46772: variable 'ansible_timeout' from source: unknown 11762 1726853286.46785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853286.47176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853286.47180: variable 'omit' from source: magic vars 11762 1726853286.47186: starting attempt loop 11762 1726853286.47188: running the handler 11762 1726853286.47190: handler run complete 11762 1726853286.47193: attempt loop complete, returning result 11762 1726853286.47195: _execute() done 11762 1726853286.47197: dumping result to json 11762 1726853286.47199: done dumping result, returning 
11762 1726853286.47201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-d845-03d0-000000000694] 11762 1726853286.47203: sending task result for task 02083763-bbaf-d845-03d0-000000000694 11762 1726853286.47276: done sending task result for task 02083763-bbaf-d845-03d0-000000000694 11762 1726853286.47280: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 11762 1726853286.47357: no more pending results, returning what we have 11762 1726853286.47360: results queue empty 11762 1726853286.47361: checking for any_errors_fatal 11762 1726853286.47368: done checking for any_errors_fatal 11762 1726853286.47369: checking for max_fail_percentage 11762 1726853286.47372: done checking for max_fail_percentage 11762 1726853286.47373: checking to see if all hosts have failed and the running result is not ok 11762 1726853286.47374: done checking to see if all hosts have failed 11762 1726853286.47374: getting the remaining hosts for this loop 11762 1726853286.47376: done getting the remaining hosts for this loop 11762 1726853286.47379: getting the next task for host managed_node2 11762 1726853286.47394: done getting next task for host managed_node2 11762 1726853286.47397: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853286.47405: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853286.47419: getting variables 11762 1726853286.47421: in VariableManager get_vars() 11762 1726853286.47458: Calling all_inventory to load vars for managed_node2 11762 1726853286.47461: Calling groups_inventory to load vars for managed_node2 11762 1726853286.47463: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853286.47504: Calling all_plugins_play to load vars for managed_node2 11762 1726853286.47509: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853286.47512: Calling groups_plugins_play to load vars for managed_node2 11762 1726853286.49560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.51289: done with get_vars() 11762 1726853286.51316: done getting variables 11762 1726853286.51384: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the 
initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:06 -0400 (0:00:00.076) 0:00:36.944 ****** 11762 1726853286.51428: entering _queue_task() for managed_node2/fail 11762 1726853286.52027: worker is 1 (out of 1 available) 11762 1726853286.52041: exiting _queue_task() for managed_node2/fail 11762 1726853286.52057: done queuing things up, now waiting for results queue to drain 11762 1726853286.52059: waiting for pending results... 11762 1726853286.52494: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853286.52937: in run() - task 02083763-bbaf-d845-03d0-000000000695 11762 1726853286.52952: variable 'ansible_search_path' from source: unknown 11762 1726853286.52957: variable 'ansible_search_path' from source: unknown 11762 1726853286.53060: calling self._execute() 11762 1726853286.53397: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853286.53403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853286.53449: variable 'omit' from source: magic vars 11762 1726853286.54303: variable 'ansible_distribution_major_version' from source: facts 11762 1726853286.54333: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853286.54741: variable 'network_state' from source: role '' defaults 11762 1726853286.54861: Evaluated conditional (network_state != {}): False 11762 1726853286.54864: when evaluation is False, skipping this task 11762 1726853286.54866: _execute() done 11762 1726853286.54869: dumping result to json 11762 1726853286.54872: done dumping result, returning 11762 1726853286.54875: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-d845-03d0-000000000695] 11762 1726853286.54878: sending task result for task 02083763-bbaf-d845-03d0-000000000695 11762 1726853286.54957: done sending task result for task 02083763-bbaf-d845-03d0-000000000695 11762 1726853286.54962: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853286.55024: no more pending results, returning what we have 11762 1726853286.55029: results queue empty 11762 1726853286.55030: checking for any_errors_fatal 11762 1726853286.55038: done checking for any_errors_fatal 11762 1726853286.55038: checking for max_fail_percentage 11762 1726853286.55040: done checking for max_fail_percentage 11762 1726853286.55041: checking to see if all hosts have failed and the running result is not ok 11762 1726853286.55044: done checking to see if all hosts have failed 11762 1726853286.55046: getting the remaining hosts for this loop 11762 1726853286.55049: done getting the remaining hosts for this loop 11762 1726853286.55053: getting the next task for host managed_node2 11762 1726853286.55066: done getting next task for host managed_node2 11762 1726853286.55070: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853286.55078: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853286.55097: getting variables 11762 1726853286.55099: in VariableManager get_vars() 11762 1726853286.55141: Calling all_inventory to load vars for managed_node2 11762 1726853286.55147: Calling groups_inventory to load vars for managed_node2 11762 1726853286.55149: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853286.55161: Calling all_plugins_play to load vars for managed_node2 11762 1726853286.55164: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853286.55167: Calling groups_plugins_play to load vars for managed_node2 11762 1726853286.56600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.60052: done with get_vars() 11762 1726853286.60166: done getting variables 11762 1726853286.60246: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** 
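The skip result above reports `"false_condition": "network_state != {}"` — the role's `when:` conditional was templated, evaluated to False, and the task was skipped with a structured result instead of running. A minimal Python sketch of that control flow, under the assumption that a plain `eval` stands in for Ansible's Jinja2 templating (the helper names `evaluate_when` and `run_task` are hypothetical, not Ansible APIs):

```python
# Hedged sketch only — NOT Ansible's actual implementation. It shows how a
# false `when:` conditional can yield the "skipping" payload seen in the log.

def evaluate_when(condition: str, variables: dict) -> bool:
    # Ansible renders the expression through Jinja2 with the host's vars;
    # eval() here is a simplified stand-in for illustration.
    return bool(eval(condition, {}, variables))

def run_task(condition: str, variables: dict) -> dict:
    if not evaluate_when(condition, variables):
        # Mirrors the skip payload printed in the log above.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# `network_state` defaults to {} in the role, so the abort task is skipped.
result = run_task("network_state != {}", {"network_state": {}})
```

The same mechanism explains every `skipping: [managed_node2]` entry in this log: the conditional string is carried into the result as `false_condition` so the skip is auditable afterwards.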
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:06 -0400 (0:00:00.088) 0:00:37.033 ****** 11762 1726853286.60295: entering _queue_task() for managed_node2/fail 11762 1726853286.60885: worker is 1 (out of 1 available) 11762 1726853286.60897: exiting _queue_task() for managed_node2/fail 11762 1726853286.60908: done queuing things up, now waiting for results queue to drain 11762 1726853286.60910: waiting for pending results... 11762 1726853286.61601: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853286.61723: in run() - task 02083763-bbaf-d845-03d0-000000000696 11762 1726853286.61728: variable 'ansible_search_path' from source: unknown 11762 1726853286.61735: variable 'ansible_search_path' from source: unknown 11762 1726853286.61739: calling self._execute() 11762 1726853286.62077: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853286.62081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853286.62085: variable 'omit' from source: magic vars 11762 1726853286.62522: variable 'ansible_distribution_major_version' from source: facts 11762 1726853286.62541: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853286.62679: variable 'network_state' from source: role '' defaults 11762 1726853286.62713: Evaluated conditional (network_state != {}): False 11762 1726853286.62720: when evaluation is False, skipping this task 11762 1726853286.62724: _execute() done 11762 1726853286.62726: dumping result to json 11762 1726853286.62733: done dumping result, returning 11762 1726853286.62766: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the 
managed host is below 8 [02083763-bbaf-d845-03d0-000000000696] 11762 1726853286.62806: sending task result for task 02083763-bbaf-d845-03d0-000000000696 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853286.63237: no more pending results, returning what we have 11762 1726853286.63242: results queue empty 11762 1726853286.63243: checking for any_errors_fatal 11762 1726853286.63252: done checking for any_errors_fatal 11762 1726853286.63253: checking for max_fail_percentage 11762 1726853286.63256: done checking for max_fail_percentage 11762 1726853286.63257: checking to see if all hosts have failed and the running result is not ok 11762 1726853286.63257: done checking to see if all hosts have failed 11762 1726853286.63258: getting the remaining hosts for this loop 11762 1726853286.63260: done getting the remaining hosts for this loop 11762 1726853286.63263: getting the next task for host managed_node2 11762 1726853286.63278: done getting next task for host managed_node2 11762 1726853286.63283: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853286.63289: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853286.63311: getting variables 11762 1726853286.63312: in VariableManager get_vars() 11762 1726853286.63355: Calling all_inventory to load vars for managed_node2 11762 1726853286.63358: Calling groups_inventory to load vars for managed_node2 11762 1726853286.63360: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853286.63489: Calling all_plugins_play to load vars for managed_node2 11762 1726853286.63494: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853286.63499: Calling groups_plugins_play to load vars for managed_node2 11762 1726853286.64106: done sending task result for task 02083763-bbaf-d845-03d0-000000000696 11762 1726853286.64110: WORKER PROCESS EXITING 11762 1726853286.65331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.67245: done with get_vars() 11762 1726853286.67272: done getting variables 11762 1726853286.67348: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:06 -0400 (0:00:00.070) 0:00:37.104 ****** 11762 1726853286.67391: entering _queue_task() for managed_node2/fail 11762 1726853286.67890: worker is 1 (out of 1 available) 11762 1726853286.67902: exiting _queue_task() for managed_node2/fail 11762 1726853286.67915: done queuing things up, now waiting for results queue to drain 11762 1726853286.67916: waiting for pending results... 11762 1726853286.68135: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853286.68379: in run() - task 02083763-bbaf-d845-03d0-000000000697 11762 1726853286.68402: variable 'ansible_search_path' from source: unknown 11762 1726853286.68417: variable 'ansible_search_path' from source: unknown 11762 1726853286.68461: calling self._execute() 11762 1726853286.68581: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853286.68595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853286.68608: variable 'omit' from source: magic vars 11762 1726853286.69506: variable 'ansible_distribution_major_version' from source: facts 11762 1726853286.69561: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853286.69834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853286.72948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853286.73037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853286.73087: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853286.73134: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853286.73166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853286.73262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853286.73307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.73576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.73579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.73581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.73583: variable 'ansible_distribution_major_version' from source: facts 11762 1726853286.73585: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11762 1726853286.73645: variable 'ansible_distribution' from source: facts 11762 1726853286.73656: variable '__network_rh_distros' from source: role '' defaults 11762 1726853286.73673: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11762 1726853286.73914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853286.73949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.73967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.74016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.74035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.74072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853286.74088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.74147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.74176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.74180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.74216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853286.74252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.74357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.74361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.74363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.74751: variable 'network_connections' from source: task vars 11762 1726853286.74766: variable 'port2_profile' from source: play vars 11762 1726853286.74856: variable 'port2_profile' from source: play vars 11762 1726853286.74875: variable 'port1_profile' from source: play vars 11762 1726853286.75030: variable 'port1_profile' from source: play vars 11762 1726853286.75033: variable 'controller_profile' from source: play vars 11762 1726853286.75079: variable 'controller_profile' from source: play vars 11762 1726853286.75101: variable 'network_state' from source: role '' defaults 11762 1726853286.75192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853286.75354: Loading 
TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853286.75385: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853286.75409: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853286.75432: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853286.75477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853286.75496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853286.75526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.75544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853286.75564: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11762 1726853286.75568: when evaluation is False, skipping this task 11762 1726853286.75570: _execute() done 11762 1726853286.75574: dumping result to json 11762 1726853286.75577: done dumping result, returning 11762 1726853286.75584: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if 
the system version of the managed host is EL10 or later [02083763-bbaf-d845-03d0-000000000697] 11762 1726853286.75589: sending task result for task 02083763-bbaf-d845-03d0-000000000697 11762 1726853286.75682: done sending task result for task 02083763-bbaf-d845-03d0-000000000697 11762 1726853286.75685: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11762 1726853286.75753: no more pending results, returning what we have 11762 1726853286.75757: results queue empty 11762 1726853286.75758: checking for any_errors_fatal 11762 1726853286.75764: done checking for any_errors_fatal 11762 1726853286.75765: checking for max_fail_percentage 11762 1726853286.75767: done checking for max_fail_percentage 11762 1726853286.75767: checking to see if all hosts have failed and the running result is not ok 11762 1726853286.75768: done checking to see if all hosts have failed 11762 1726853286.75769: getting the remaining hosts for this loop 11762 1726853286.75776: done getting the remaining hosts for this loop 11762 1726853286.75785: getting the next task for host managed_node2 11762 1726853286.75794: done getting next task for host managed_node2 11762 1726853286.75799: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853286.75804: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853286.75820: getting variables 11762 1726853286.75822: in VariableManager get_vars() 11762 1726853286.75870: Calling all_inventory to load vars for managed_node2 11762 1726853286.75989: Calling groups_inventory to load vars for managed_node2 11762 1726853286.75993: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853286.76002: Calling all_plugins_play to load vars for managed_node2 11762 1726853286.76007: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853286.76011: Calling groups_plugins_play to load vars for managed_node2 11762 1726853286.77298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.79875: done with get_vars() 11762 1726853286.79907: done getting variables 11762 1726853286.80023: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:06 -0400 (0:00:00.126) 0:00:37.231 ****** 11762 1726853286.80064: entering _queue_task() for managed_node2/dnf 11762 1726853286.81003: worker is 1 (out of 1 available) 11762 1726853286.81017: exiting _queue_task() for managed_node2/dnf 11762 1726853286.81103: done queuing things up, now waiting for results queue to drain 11762 1726853286.81105: waiting for pending results... 11762 1726853286.81490: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853286.81581: in run() - task 02083763-bbaf-d845-03d0-000000000698 11762 1726853286.81586: variable 'ansible_search_path' from source: unknown 11762 1726853286.81589: variable 'ansible_search_path' from source: unknown 11762 1726853286.81593: calling self._execute() 11762 1726853286.81675: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853286.81724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853286.81733: variable 'omit' from source: magic vars 11762 1726853286.82055: variable 'ansible_distribution_major_version' from source: facts 11762 1726853286.82059: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853286.82234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853286.86761: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853286.87001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853286.87041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853286.87088: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853286.87103: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853286.87584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853286.87612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.87637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.87877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.87957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.88252: variable 'ansible_distribution' from source: facts 11762 1726853286.88255: variable 'ansible_distribution_major_version' from source: facts 11762 1726853286.88284: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 11762 1726853286.88792: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853286.88922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853286.88948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.88970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.89381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.89384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.89387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853286.89406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.89431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.89468: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.89934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.90031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853286.90034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853286.90037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.90054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853286.90068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853286.90746: variable 'network_connections' from source: task vars 11762 1726853286.90759: variable 'port2_profile' from source: play vars 11762 1726853286.91027: variable 'port2_profile' from source: play vars 11762 1726853286.91176: variable 'port1_profile' from source: play vars 11762 1726853286.91179: variable 'port1_profile' from source: play vars 11762 1726853286.91182: variable 'controller_profile' from source: play vars 11762 
1726853286.91184: variable 'controller_profile' from source: play vars 11762 1726853286.91449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853286.91886: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853286.92042: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853286.92070: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853286.92102: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853286.92261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853286.92285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853286.92310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853286.92481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853286.92525: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853286.93180: variable 'network_connections' from source: task vars 11762 1726853286.93183: variable 'port2_profile' from source: play vars 11762 1726853286.93485: variable 'port2_profile' from source: play vars 11762 1726853286.93488: variable 'port1_profile' from source: play vars 11762 1726853286.93592: variable 
'port1_profile' from source: play vars 11762 1726853286.93596: variable 'controller_profile' from source: play vars 11762 1726853286.93763: variable 'controller_profile' from source: play vars 11762 1726853286.93793: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853286.93797: when evaluation is False, skipping this task 11762 1726853286.93799: _execute() done 11762 1726853286.93802: dumping result to json 11762 1726853286.93805: done dumping result, returning 11762 1726853286.93831: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000698] 11762 1726853286.93834: sending task result for task 02083763-bbaf-d845-03d0-000000000698 11762 1726853286.94261: done sending task result for task 02083763-bbaf-d845-03d0-000000000698 11762 1726853286.94264: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853286.94320: no more pending results, returning what we have 11762 1726853286.94323: results queue empty 11762 1726853286.94324: checking for any_errors_fatal 11762 1726853286.94330: done checking for any_errors_fatal 11762 1726853286.94331: checking for max_fail_percentage 11762 1726853286.94333: done checking for max_fail_percentage 11762 1726853286.94333: checking to see if all hosts have failed and the running result is not ok 11762 1726853286.94334: done checking to see if all hosts have failed 11762 1726853286.94335: getting the remaining hosts for this loop 11762 1726853286.94336: done getting the remaining hosts for this loop 11762 1726853286.94339: getting the next task for host managed_node2 11762 1726853286.94348: done getting next 
task for host managed_node2 11762 1726853286.94351: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853286.94356: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853286.94373: getting variables 11762 1726853286.94375: in VariableManager get_vars() 11762 1726853286.94411: Calling all_inventory to load vars for managed_node2 11762 1726853286.94413: Calling groups_inventory to load vars for managed_node2 11762 1726853286.94415: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853286.94424: Calling all_plugins_play to load vars for managed_node2 11762 1726853286.94426: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853286.94428: Calling groups_plugins_play to load vars for managed_node2 11762 1726853286.96132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853286.98811: done with get_vars() 11762 1726853286.98847: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11762 1726853286.99114: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:28:06 -0400 (0:00:00.190) 0:00:37.422 ****** 11762 1726853286.99161: entering _queue_task() for managed_node2/yum 11762 1726853286.99763: worker is 1 (out of 1 available) 11762 1726853286.99898: exiting _queue_task() for managed_node2/yum 11762 1726853286.99911: done queuing things up, now waiting for results queue to drain 11762 1726853286.99913: waiting for pending results... 
11762 1726853287.00340: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853287.00348: in run() - task 02083763-bbaf-d845-03d0-000000000699 11762 1726853287.00352: variable 'ansible_search_path' from source: unknown 11762 1726853287.00355: variable 'ansible_search_path' from source: unknown 11762 1726853287.00398: calling self._execute() 11762 1726853287.00506: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.00520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.00558: variable 'omit' from source: magic vars 11762 1726853287.01046: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.01098: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853287.01276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853287.06035: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853287.06040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853287.06156: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853287.06253: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853287.06304: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853287.06581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.06599: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.06630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.06731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.06753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.06976: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.07037: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11762 1726853287.07131: when evaluation is False, skipping this task 11762 1726853287.07140: _execute() done 11762 1726853287.07150: dumping result to json 11762 1726853287.07158: done dumping result, returning 11762 1726853287.07170: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000699] 11762 1726853287.07183: sending task result for task 02083763-bbaf-d845-03d0-000000000699 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11762 1726853287.07365: no more pending results, returning what we have 11762 1726853287.07370: results queue empty 11762 1726853287.07373: checking for any_errors_fatal 11762 1726853287.07380: done 
checking for any_errors_fatal 11762 1726853287.07381: checking for max_fail_percentage 11762 1726853287.07383: done checking for max_fail_percentage 11762 1726853287.07384: checking to see if all hosts have failed and the running result is not ok 11762 1726853287.07385: done checking to see if all hosts have failed 11762 1726853287.07386: getting the remaining hosts for this loop 11762 1726853287.07388: done getting the remaining hosts for this loop 11762 1726853287.07392: getting the next task for host managed_node2 11762 1726853287.07400: done getting next task for host managed_node2 11762 1726853287.07416: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853287.07423: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853287.07584: getting variables 11762 1726853287.07586: in VariableManager get_vars() 11762 1726853287.07629: Calling all_inventory to load vars for managed_node2 11762 1726853287.07632: Calling groups_inventory to load vars for managed_node2 11762 1726853287.07635: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853287.07648: Calling all_plugins_play to load vars for managed_node2 11762 1726853287.07651: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853287.07655: Calling groups_plugins_play to load vars for managed_node2 11762 1726853287.08442: done sending task result for task 02083763-bbaf-d845-03d0-000000000699 11762 1726853287.08446: WORKER PROCESS EXITING 11762 1726853287.11719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853287.23338: done with get_vars() 11762 1726853287.23559: done getting variables 11762 1726853287.23616: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:28:07 -0400 (0:00:00.244) 0:00:37.666 ****** 11762 1726853287.23651: entering _queue_task() for managed_node2/fail 11762 1726853287.24325: worker is 1 (out of 1 available) 11762 1726853287.24339: exiting _queue_task() for managed_node2/fail 11762 1726853287.24351: done queuing things up, now waiting for results queue to drain 11762 1726853287.24354: waiting for pending results... 
11762 1726853287.24721: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853287.24862: in run() - task 02083763-bbaf-d845-03d0-00000000069a 11762 1726853287.24883: variable 'ansible_search_path' from source: unknown 11762 1726853287.24888: variable 'ansible_search_path' from source: unknown 11762 1726853287.24926: calling self._execute() 11762 1726853287.25067: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.25074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.25078: variable 'omit' from source: magic vars 11762 1726853287.25510: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.25521: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853287.25611: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853287.25751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853287.27433: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853287.27503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853287.27540: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853287.27581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853287.27607: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853287.27692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853287.27721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.27742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.27777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.27788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.27821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.27838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.27864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.27899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.27932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.27993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.27998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.28029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.28057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.28155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.28376: variable 'network_connections' from source: task vars 11762 1726853287.28379: variable 'port2_profile' from source: play vars 11762 1726853287.28401: variable 'port2_profile' from source: play vars 11762 1726853287.28418: variable 'port1_profile' from source: play vars 11762 1726853287.28483: variable 'port1_profile' from source: play vars 11762 1726853287.28510: variable 'controller_profile' from source: play vars 11762 1726853287.28569: variable 'controller_profile' from source: play vars 11762 1726853287.28660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853287.28876: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853287.28921: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853287.28961: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853287.28984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853287.29015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853287.29031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853287.29054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.29074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853287.29114: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853287.29267: variable 'network_connections' from source: task vars 11762 1726853287.29272: variable 'port2_profile' from source: play vars 11762 1726853287.29316: variable 'port2_profile' from source: play vars 11762 1726853287.29323: variable 'port1_profile' from source: play vars 11762 1726853287.29374: variable 'port1_profile' from source: play vars 11762 1726853287.29382: variable 'controller_profile' from source: play vars 11762 1726853287.29424: variable 'controller_profile' from source: play vars 11762 1726853287.29444: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853287.29457: when evaluation is False, skipping this task 11762 1726853287.29459: _execute() done 11762 1726853287.29462: dumping result to json 11762 1726853287.29464: done dumping result, returning 11762 1726853287.29467: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-00000000069a] 11762 1726853287.29469: sending task result for task 02083763-bbaf-d845-03d0-00000000069a 11762 1726853287.29567: done sending task result for task 02083763-bbaf-d845-03d0-00000000069a 11762 1726853287.29569: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853287.29639: no more pending results, returning what we have 11762 1726853287.29643: results queue empty 11762 1726853287.29644: checking for any_errors_fatal 11762 1726853287.29652: done checking for any_errors_fatal 11762 1726853287.29652: checking for max_fail_percentage 11762 1726853287.29654: done checking for max_fail_percentage 11762 1726853287.29655: checking to see if all hosts have failed and the running result is not ok 11762 1726853287.29655: done checking to see if all hosts have failed 11762 1726853287.29656: getting the remaining hosts for this loop 11762 1726853287.29658: done getting the remaining hosts for this loop 11762 1726853287.29662: getting the next task for host managed_node2 11762 1726853287.29669: done getting next task for host managed_node2 11762 1726853287.29674: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11762 1726853287.29679: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853287.29696: getting variables 11762 1726853287.29698: in VariableManager get_vars() 11762 1726853287.29733: Calling all_inventory to load vars for managed_node2 11762 1726853287.29735: Calling groups_inventory to load vars for managed_node2 11762 1726853287.29737: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853287.29746: Calling all_plugins_play to load vars for managed_node2 11762 1726853287.29748: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853287.29751: Calling groups_plugins_play to load vars for managed_node2 11762 1726853287.30553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853287.31428: done with get_vars() 11762 1726853287.31447: done getting variables 11762 1726853287.31493: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:28:07 -0400 (0:00:00.078) 0:00:37.745 ****** 11762 1726853287.31521: entering _queue_task() for managed_node2/package 11762 1726853287.31787: worker is 1 (out of 1 available) 11762 1726853287.31802: exiting _queue_task() for managed_node2/package 11762 1726853287.31816: done queuing things up, now waiting for results queue to drain 11762 1726853287.31818: waiting for pending results... 
11762 1726853287.32011: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11762 1726853287.32162: in run() - task 02083763-bbaf-d845-03d0-00000000069b 11762 1726853287.32166: variable 'ansible_search_path' from source: unknown 11762 1726853287.32170: variable 'ansible_search_path' from source: unknown 11762 1726853287.32214: calling self._execute() 11762 1726853287.32302: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.32305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.32349: variable 'omit' from source: magic vars 11762 1726853287.32712: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.32715: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853287.32921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853287.33195: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853287.33234: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853287.33273: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853287.33341: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853287.33697: variable 'network_packages' from source: role '' defaults 11762 1726853287.33700: variable '__network_provider_setup' from source: role '' defaults 11762 1726853287.33702: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853287.33705: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853287.33707: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853287.33709: variable 
'__network_packages_default_nm' from source: role '' defaults 11762 1726853287.33866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853287.36167: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853287.36226: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853287.36255: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853287.36285: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853287.36307: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853287.36387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.36411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.36433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.36478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.36494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 
1726853287.36540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.36606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.36609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.36682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.36685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.36892: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853287.37014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.37026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.37075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.37089: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.37105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.37191: variable 'ansible_python' from source: facts 11762 1726853287.37213: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853287.37290: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853287.37363: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853287.37475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.37676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.37680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.37683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.37685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.37687: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.37699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.37701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.37704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.37706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.37855: variable 'network_connections' from source: task vars 11762 1726853287.37859: variable 'port2_profile' from source: play vars 11762 1726853287.37912: variable 'port2_profile' from source: play vars 11762 1726853287.37921: variable 'port1_profile' from source: play vars 11762 1726853287.38008: variable 'port1_profile' from source: play vars 11762 1726853287.38011: variable 'controller_profile' from source: play vars 11762 1726853287.38086: variable 'controller_profile' from source: play vars 11762 1726853287.38136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853287.38156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 11762 1726853287.38181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.38208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853287.38258: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853287.38475: variable 'network_connections' from source: task vars 11762 1726853287.38479: variable 'port2_profile' from source: play vars 11762 1726853287.38550: variable 'port2_profile' from source: play vars 11762 1726853287.38558: variable 'port1_profile' from source: play vars 11762 1726853287.38630: variable 'port1_profile' from source: play vars 11762 1726853287.38638: variable 'controller_profile' from source: play vars 11762 1726853287.38717: variable 'controller_profile' from source: play vars 11762 1726853287.38759: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853287.38824: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853287.39028: variable 'network_connections' from source: task vars 11762 1726853287.39031: variable 'port2_profile' from source: play vars 11762 1726853287.39081: variable 'port2_profile' from source: play vars 11762 1726853287.39087: variable 'port1_profile' from source: play vars 11762 1726853287.39130: variable 'port1_profile' from source: play vars 11762 1726853287.39137: variable 'controller_profile' from source: play vars 11762 1726853287.39185: variable 'controller_profile' from source: play vars 11762 1726853287.39202: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853287.39255: variable '__network_team_connections_defined' from 
source: role '' defaults 11762 1726853287.39450: variable 'network_connections' from source: task vars 11762 1726853287.39453: variable 'port2_profile' from source: play vars 11762 1726853287.39501: variable 'port2_profile' from source: play vars 11762 1726853287.39507: variable 'port1_profile' from source: play vars 11762 1726853287.39550: variable 'port1_profile' from source: play vars 11762 1726853287.39557: variable 'controller_profile' from source: play vars 11762 1726853287.39606: variable 'controller_profile' from source: play vars 11762 1726853287.39644: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853287.39684: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853287.39690: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853287.39734: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853287.39870: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853287.40166: variable 'network_connections' from source: task vars 11762 1726853287.40170: variable 'port2_profile' from source: play vars 11762 1726853287.40214: variable 'port2_profile' from source: play vars 11762 1726853287.40220: variable 'port1_profile' from source: play vars 11762 1726853287.40264: variable 'port1_profile' from source: play vars 11762 1726853287.40276: variable 'controller_profile' from source: play vars 11762 1726853287.40313: variable 'controller_profile' from source: play vars 11762 1726853287.40320: variable 'ansible_distribution' from source: facts 11762 1726853287.40323: variable '__network_rh_distros' from source: role '' defaults 11762 1726853287.40328: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.40340: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853287.40449: 
variable 'ansible_distribution' from source: facts 11762 1726853287.40452: variable '__network_rh_distros' from source: role '' defaults 11762 1726853287.40454: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.40466: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853287.40573: variable 'ansible_distribution' from source: facts 11762 1726853287.40576: variable '__network_rh_distros' from source: role '' defaults 11762 1726853287.40582: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.40611: variable 'network_provider' from source: set_fact 11762 1726853287.40624: variable 'ansible_facts' from source: unknown 11762 1726853287.41011: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11762 1726853287.41014: when evaluation is False, skipping this task 11762 1726853287.41019: _execute() done 11762 1726853287.41022: dumping result to json 11762 1726853287.41026: done dumping result, returning 11762 1726853287.41029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-d845-03d0-00000000069b] 11762 1726853287.41040: sending task result for task 02083763-bbaf-d845-03d0-00000000069b 11762 1726853287.41132: done sending task result for task 02083763-bbaf-d845-03d0-00000000069b 11762 1726853287.41135: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11762 1726853287.41195: no more pending results, returning what we have 11762 1726853287.41199: results queue empty 11762 1726853287.41199: checking for any_errors_fatal 11762 1726853287.41206: done checking for any_errors_fatal 11762 1726853287.41207: checking for max_fail_percentage 11762 1726853287.41208: done checking for max_fail_percentage 11762 
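The skip recorded above comes from the conditional `not network_packages is subset(ansible_facts.packages.keys())`. Ansible's builtin `subset` test is equivalent to Python set containment, so the guard can be sketched as follows (the package names are hypothetical illustrations, not values from this run):

```python
# Sketch of the `when: not network_packages is subset(ansible_facts.packages.keys())`
# guard. Ansible's builtin `subset` test checks that every element of the left
# list appears in the right collection; package names here are made up.
network_packages = ["NetworkManager"]
installed = {"NetworkManager": "1.48", "openssh-server": "9.6"}  # like ansible_facts.packages

# network_packages is subset(ansible_facts.packages.keys())
already_installed = set(network_packages) <= set(installed.keys())

# The task runs only when something is still missing.
should_run = not already_installed
print(should_run)  # False here, matching the "Conditional result was False" skip
```

When the conditional evaluates to False, as in the log, the Install packages task is skipped without contacting the package manager.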
1726853287.41209: checking to see if all hosts have failed and the running result is not ok 11762 1726853287.41210: done checking to see if all hosts have failed 11762 1726853287.41210: getting the remaining hosts for this loop 11762 1726853287.41212: done getting the remaining hosts for this loop 11762 1726853287.41221: getting the next task for host managed_node2 11762 1726853287.41228: done getting next task for host managed_node2 11762 1726853287.41232: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853287.41237: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853287.41258: getting variables 11762 1726853287.41260: in VariableManager get_vars() 11762 1726853287.41300: Calling all_inventory to load vars for managed_node2 11762 1726853287.41303: Calling groups_inventory to load vars for managed_node2 11762 1726853287.41305: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853287.41314: Calling all_plugins_play to load vars for managed_node2 11762 1726853287.41316: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853287.41319: Calling groups_plugins_play to load vars for managed_node2 11762 1726853287.42276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853287.43324: done with get_vars() 11762 1726853287.43342: done getting variables 11762 1726853287.43401: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:07 -0400 (0:00:00.119) 0:00:37.864 ****** 11762 1726853287.43444: entering _queue_task() for managed_node2/package 11762 1726853287.43730: worker is 1 (out of 1 available) 11762 1726853287.43743: exiting _queue_task() for managed_node2/package 11762 1726853287.43758: done queuing things up, now waiting for results queue to drain 11762 1726853287.43760: waiting for pending results... 
11762 1726853287.44004: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853287.44134: in run() - task 02083763-bbaf-d845-03d0-00000000069c 11762 1726853287.44156: variable 'ansible_search_path' from source: unknown 11762 1726853287.44160: variable 'ansible_search_path' from source: unknown 11762 1726853287.44202: calling self._execute() 11762 1726853287.44309: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.44312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.44315: variable 'omit' from source: magic vars 11762 1726853287.44641: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.44648: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853287.44772: variable 'network_state' from source: role '' defaults 11762 1726853287.44776: Evaluated conditional (network_state != {}): False 11762 1726853287.44779: when evaluation is False, skipping this task 11762 1726853287.44787: _execute() done 11762 1726853287.44791: dumping result to json 11762 1726853287.44794: done dumping result, returning 11762 1726853287.44796: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-d845-03d0-00000000069c] 11762 1726853287.44799: sending task result for task 02083763-bbaf-d845-03d0-00000000069c 11762 1726853287.44912: done sending task result for task 02083763-bbaf-d845-03d0-00000000069c 11762 1726853287.44915: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853287.44976: no more pending results, returning what we have 11762 1726853287.44980: results queue empty 11762 1726853287.44981: checking 
for any_errors_fatal 11762 1726853287.44986: done checking for any_errors_fatal 11762 1726853287.44987: checking for max_fail_percentage 11762 1726853287.44988: done checking for max_fail_percentage 11762 1726853287.44989: checking to see if all hosts have failed and the running result is not ok 11762 1726853287.44989: done checking to see if all hosts have failed 11762 1726853287.44990: getting the remaining hosts for this loop 11762 1726853287.44992: done getting the remaining hosts for this loop 11762 1726853287.44995: getting the next task for host managed_node2 11762 1726853287.45001: done getting next task for host managed_node2 11762 1726853287.45004: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853287.45009: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853287.45024: getting variables 11762 1726853287.45025: in VariableManager get_vars() 11762 1726853287.45066: Calling all_inventory to load vars for managed_node2 11762 1726853287.45069: Calling groups_inventory to load vars for managed_node2 11762 1726853287.45075: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853287.45087: Calling all_plugins_play to load vars for managed_node2 11762 1726853287.45090: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853287.45093: Calling groups_plugins_play to load vars for managed_node2 11762 1726853287.46052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853287.46954: done with get_vars() 11762 1726853287.46968: done getting variables 11762 1726853287.47012: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:07 -0400 (0:00:00.035) 0:00:37.900 ****** 11762 1726853287.47037: entering _queue_task() for managed_node2/package 11762 1726853287.47265: worker is 1 (out of 1 available) 11762 1726853287.47280: exiting _queue_task() for managed_node2/package 11762 1726853287.47293: done queuing things up, now waiting for results queue to drain 11762 1726853287.47295: waiting for pending results... 
11762 1726853287.47587: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853287.47699: in run() - task 02083763-bbaf-d845-03d0-00000000069d 11762 1726853287.47723: variable 'ansible_search_path' from source: unknown 11762 1726853287.47768: variable 'ansible_search_path' from source: unknown 11762 1726853287.47810: calling self._execute() 11762 1726853287.47904: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.47916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.47931: variable 'omit' from source: magic vars 11762 1726853287.48482: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.48576: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853287.48645: variable 'network_state' from source: role '' defaults 11762 1726853287.48661: Evaluated conditional (network_state != {}): False 11762 1726853287.48668: when evaluation is False, skipping this task 11762 1726853287.48678: _execute() done 11762 1726853287.48686: dumping result to json 11762 1726853287.48694: done dumping result, returning 11762 1726853287.48707: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-d845-03d0-00000000069d] 11762 1726853287.48718: sending task result for task 02083763-bbaf-d845-03d0-00000000069d 11762 1726853287.48940: done sending task result for task 02083763-bbaf-d845-03d0-00000000069d 11762 1726853287.48944: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853287.48996: no more pending results, returning what we have 11762 1726853287.49000: results queue empty 11762 1726853287.49000: checking for 
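Both nmstate-related install tasks above skip on the same guard, `when: network_state != {}`. The role only installs NetworkManager/nmstate packages for the state-based interface when the caller supplies a non-empty `network_state`; a minimal sketch of that check (the non-empty example value is hypothetical):

```python
# Sketch of the `when: network_state != {}` guard used by both
# "Install NetworkManager and nmstate ..." and "Install python3-libnmstate ..."
# tasks. network_state defaults to an empty mapping in the role.
network_state = {}  # role default, as in this run

run_nmstate_tasks = network_state != {}
print(run_nmstate_tasks)  # False: both tasks are skipped

# A caller opting into the state-based interface (hypothetical value)
# would make the guard True:
network_state = {"interfaces": [{"name": "eth0", "state": "up"}]}
print(network_state != {})  # True
```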
any_errors_fatal 11762 1726853287.49016: done checking for any_errors_fatal 11762 1726853287.49017: checking for max_fail_percentage 11762 1726853287.49023: done checking for max_fail_percentage 11762 1726853287.49024: checking to see if all hosts have failed and the running result is not ok 11762 1726853287.49024: done checking to see if all hosts have failed 11762 1726853287.49025: getting the remaining hosts for this loop 11762 1726853287.49027: done getting the remaining hosts for this loop 11762 1726853287.49031: getting the next task for host managed_node2 11762 1726853287.49038: done getting next task for host managed_node2 11762 1726853287.49044: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853287.49050: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853287.49067: getting variables 11762 1726853287.49069: in VariableManager get_vars() 11762 1726853287.49109: Calling all_inventory to load vars for managed_node2 11762 1726853287.49139: Calling groups_inventory to load vars for managed_node2 11762 1726853287.49144: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853287.49153: Calling all_plugins_play to load vars for managed_node2 11762 1726853287.49155: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853287.49158: Calling groups_plugins_play to load vars for managed_node2 11762 1726853287.50054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853287.51233: done with get_vars() 11762 1726853287.51258: done getting variables 11762 1726853287.51319: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:07 -0400 (0:00:00.043) 0:00:37.944 ****** 11762 1726853287.51357: entering _queue_task() for managed_node2/service 11762 1726853287.51680: worker is 1 (out of 1 available) 11762 1726853287.51693: exiting _queue_task() for managed_node2/service 11762 1726853287.51709: done queuing things up, now waiting for results queue to drain 11762 1726853287.51710: waiting for pending results... 
11762 1726853287.52126: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853287.52230: in run() - task 02083763-bbaf-d845-03d0-00000000069e 11762 1726853287.52236: variable 'ansible_search_path' from source: unknown 11762 1726853287.52240: variable 'ansible_search_path' from source: unknown 11762 1726853287.52243: calling self._execute() 11762 1726853287.52403: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.52410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.52416: variable 'omit' from source: magic vars 11762 1726853287.52804: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.52807: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853287.53080: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853287.53286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853287.56421: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853287.56500: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853287.56649: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853287.56653: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853287.56655: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853287.56706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11762 1726853287.56740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.56806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.57145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.57149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.57151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.57153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.57156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.57224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.57227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.57313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.57317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.57454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.57493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.57506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.57751: variable 'network_connections' from source: task vars 11762 1726853287.57754: variable 'port2_profile' from source: play vars 11762 1726853287.57995: variable 'port2_profile' from source: play vars 11762 1726853287.58007: variable 'port1_profile' from source: play vars 11762 1726853287.58091: variable 'port1_profile' from source: play vars 11762 1726853287.58095: variable 'controller_profile' from source: play vars 11762 1726853287.58143: variable 'controller_profile' from source: play vars 11762 1726853287.58230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853287.58889: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853287.58925: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853287.59290: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853287.59324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853287.59378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853287.59513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853287.59541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.59593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853287.59928: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853287.60075: variable 'network_connections' from source: task vars 11762 1726853287.60081: variable 'port2_profile' from source: play vars 11762 1726853287.60142: variable 'port2_profile' from source: play vars 11762 1726853287.60157: variable 'port1_profile' from source: play vars 11762 1726853287.60211: variable 'port1_profile' from source: play vars 11762 1726853287.60218: variable 'controller_profile' from source: play vars 11762 1726853287.60479: variable 'controller_profile' from source: play vars 11762 1726853287.60482: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853287.60492: when evaluation is False, skipping this task 11762 1726853287.60494: _execute() done 11762 1726853287.60495: dumping result to json 11762 1726853287.60497: done dumping result, returning 11762 1726853287.60499: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-00000000069e] 11762 1726853287.60500: sending task result for task 02083763-bbaf-d845-03d0-00000000069e 11762 1726853287.60564: done sending task result for task 02083763-bbaf-d845-03d0-00000000069e 11762 1726853287.60567: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853287.60624: no more pending results, returning what we have 11762 1726853287.60628: results queue empty 11762 1726853287.60628: checking for any_errors_fatal 11762 1726853287.60634: done checking for any_errors_fatal 11762 1726853287.60635: checking for max_fail_percentage 11762 1726853287.60636: done checking for max_fail_percentage 11762 1726853287.60637: checking to see if all hosts have failed and the running result is not ok 11762 1726853287.60638: done checking to see if all hosts have failed 11762 1726853287.60638: getting the remaining hosts for this loop 11762 1726853287.60640: done getting the remaining hosts for this loop 11762 1726853287.60646: getting the next task for host managed_node2 11762 1726853287.60654: done getting next task for host managed_node2 11762 1726853287.60659: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853287.60664: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
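The restart task above skips because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is true for this run's `network_connections` (a bond controller plus two ethernet ports, per the `controller_profile`/`port1_profile`/`port2_profile` variables). A rough sketch of how such flags could be derived from the profiles' `type` fields, with hypothetical profile contents:

```python
# Hypothetical sketch: deriving the wireless/team flags from connection
# profiles. The profile dicts below are illustrative, not taken from this run.
network_connections = [
    {"name": "bond0", "type": "bond"},        # controller profile
    {"name": "bond0.0", "type": "ethernet"},  # port 1
    {"name": "bond0.1", "type": "ethernet"},  # port 2
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

# when: __network_wireless_connections_defined or __network_team_connections_defined
restart_needed = wireless_defined or team_defined
print(restart_needed)  # False: NetworkManager is not restarted
```

NetworkManager only needs a restart here to pick up the wpa_supplicant or teamd plugins, so a plain bond/ethernet topology like this one leaves the service untouched.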
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853287.60683: getting variables 11762 1726853287.60685: in VariableManager get_vars() 11762 1726853287.60724: Calling all_inventory to load vars for managed_node2 11762 1726853287.60727: Calling groups_inventory to load vars for managed_node2 11762 1726853287.60729: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853287.60738: Calling all_plugins_play to load vars for managed_node2 11762 1726853287.60741: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853287.60747: Calling groups_plugins_play to load vars for managed_node2 11762 1726853287.63012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853287.66497: done with get_vars() 11762 1726853287.66521: done getting variables 11762 1726853287.66688: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:28:07 -0400 (0:00:00.153) 0:00:38.097 ****** 11762 1726853287.66726: entering _queue_task() for managed_node2/service 11762 1726853287.67607: worker is 1 (out of 1 available) 11762 1726853287.67621: exiting _queue_task() for managed_node2/service 11762 1726853287.67748: done queuing things up, now waiting for results queue to drain 11762 1726853287.67750: waiting for pending results... 
11762 1726853287.68236: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853287.68364: in run() - task 02083763-bbaf-d845-03d0-00000000069f 11762 1726853287.68389: variable 'ansible_search_path' from source: unknown 11762 1726853287.68406: variable 'ansible_search_path' from source: unknown 11762 1726853287.68476: calling self._execute() 11762 1726853287.68570: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.68585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.68624: variable 'omit' from source: magic vars 11762 1726853287.69378: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.69414: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853287.69804: variable 'network_provider' from source: set_fact 11762 1726853287.69822: variable 'network_state' from source: role '' defaults 11762 1726853287.70180: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11762 1726853287.70185: variable 'omit' from source: magic vars 11762 1726853287.70188: variable 'omit' from source: magic vars 11762 1726853287.70207: variable 'network_service_name' from source: role '' defaults 11762 1726853287.70340: variable 'network_service_name' from source: role '' defaults 11762 1726853287.70801: variable '__network_provider_setup' from source: role '' defaults 11762 1726853287.70812: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853287.71297: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853287.71300: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853287.71406: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853287.72201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 11762 1726853287.75792: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853287.75888: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853287.75941: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853287.75987: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853287.76029: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853287.76119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.76166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.76201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.76261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.76283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.76333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853287.76379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.76410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.76464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.76486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.76759: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853287.77015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.77047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.77176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.77224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.77465: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.77576: variable 'ansible_python' from source: facts 11762 1726853287.77598: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853287.77814: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853287.77977: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853287.78345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.78380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.78466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.78655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.78659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.78778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853287.78816: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853287.78848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.78916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853287.78999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853287.79264: variable 'network_connections' from source: task vars 11762 1726853287.79424: variable 'port2_profile' from source: play vars 11762 1726853287.79587: variable 'port2_profile' from source: play vars 11762 1726853287.79611: variable 'port1_profile' from source: play vars 11762 1726853287.79801: variable 'port1_profile' from source: play vars 11762 1726853287.79822: variable 'controller_profile' from source: play vars 11762 1726853287.79999: variable 'controller_profile' from source: play vars 11762 1726853287.80477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853287.80756: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853287.80855: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853287.80968: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853287.81015: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853287.81213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853287.81294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853287.81333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853287.81492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853287.81557: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853287.82287: variable 'network_connections' from source: task vars 11762 1726853287.82347: variable 'port2_profile' from source: play vars 11762 1726853287.82510: variable 'port2_profile' from source: play vars 11762 1726853287.82567: variable 'port1_profile' from source: play vars 11762 1726853287.82725: variable 'port1_profile' from source: play vars 11762 1726853287.82741: variable 'controller_profile' from source: play vars 11762 1726853287.82925: variable 'controller_profile' from source: play vars 11762 1726853287.83030: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853287.83250: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853287.83992: variable 'network_connections' from source: task vars 11762 1726853287.84002: variable 'port2_profile' from source: play vars 11762 1726853287.84004: variable 'port2_profile' from source: play vars 11762 
1726853287.84100: variable 'port1_profile' from source: play vars 11762 1726853287.84163: variable 'port1_profile' from source: play vars 11762 1726853287.84220: variable 'controller_profile' from source: play vars 11762 1726853287.84395: variable 'controller_profile' from source: play vars 11762 1726853287.84435: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853287.84668: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853287.85390: variable 'network_connections' from source: task vars 11762 1726853287.85410: variable 'port2_profile' from source: play vars 11762 1726853287.85736: variable 'port2_profile' from source: play vars 11762 1726853287.85739: variable 'port1_profile' from source: play vars 11762 1726853287.85741: variable 'port1_profile' from source: play vars 11762 1726853287.85785: variable 'controller_profile' from source: play vars 11762 1726853287.85929: variable 'controller_profile' from source: play vars 11762 1726853287.86316: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853287.86351: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853287.86394: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853287.86458: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853287.86987: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853287.87976: variable 'network_connections' from source: task vars 11762 1726853287.87988: variable 'port2_profile' from source: play vars 11762 1726853287.88189: variable 'port2_profile' from source: play vars 11762 1726853287.88202: variable 'port1_profile' from source: play vars 11762 1726853287.88269: variable 'port1_profile' from source: play vars 11762 1726853287.88353: variable 'controller_profile' from source: play vars 11762 
1726853287.88428: variable 'controller_profile' from source: play vars 11762 1726853287.88442: variable 'ansible_distribution' from source: facts 11762 1726853287.88452: variable '__network_rh_distros' from source: role '' defaults 11762 1726853287.88462: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.88487: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853287.88931: variable 'ansible_distribution' from source: facts 11762 1726853287.88935: variable '__network_rh_distros' from source: role '' defaults 11762 1726853287.88937: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.88940: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853287.89104: variable 'ansible_distribution' from source: facts 11762 1726853287.89113: variable '__network_rh_distros' from source: role '' defaults 11762 1726853287.89123: variable 'ansible_distribution_major_version' from source: facts 11762 1726853287.89176: variable 'network_provider' from source: set_fact 11762 1726853287.89292: variable 'omit' from source: magic vars 11762 1726853287.89326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853287.89363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853287.89396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853287.89420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853287.89437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853287.89478: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853287.89487: 
variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.89580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.89610: Set connection var ansible_timeout to 10 11762 1726853287.89618: Set connection var ansible_shell_type to sh 11762 1726853287.89628: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853287.89637: Set connection var ansible_shell_executable to /bin/sh 11762 1726853287.89648: Set connection var ansible_pipelining to False 11762 1726853287.89658: Set connection var ansible_connection to ssh 11762 1726853287.89696: variable 'ansible_shell_executable' from source: unknown 11762 1726853287.89704: variable 'ansible_connection' from source: unknown 11762 1726853287.89717: variable 'ansible_module_compression' from source: unknown 11762 1726853287.89725: variable 'ansible_shell_type' from source: unknown 11762 1726853287.89732: variable 'ansible_shell_executable' from source: unknown 11762 1726853287.89738: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853287.89746: variable 'ansible_pipelining' from source: unknown 11762 1726853287.89752: variable 'ansible_timeout' from source: unknown 11762 1726853287.89759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853287.89885: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853287.89908: variable 'omit' from source: magic vars 11762 1726853287.89929: starting attempt loop 11762 1726853287.89932: running the handler 11762 1726853287.90038: variable 'ansible_facts' from source: unknown 11762 1726853287.90845: _low_level_execute_command(): starting 11762 1726853287.90859: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853287.91804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853287.91899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853287.91916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853287.92015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853287.92149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853287.92175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853287.92369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853287.94115: stdout chunk (state=3): >>>/root <<< 11762 1726853287.94304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853287.94398: stderr chunk (state=3): >>><<< 11762 1726853287.94416: stdout chunk (state=3): >>><<< 11762 1726853287.94437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853287.94530: _low_level_execute_command(): starting 11762 1726853287.94534: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407 `" && echo ansible-tmp-1726853287.9444485-13595-97720388834407="` echo /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407 `" ) && sleep 0' 11762 1726853287.95826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853287.95948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853287.95962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853287.96016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853287.96078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853287.96081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853287.96502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853287.98292: stdout chunk (state=3): >>>ansible-tmp-1726853287.9444485-13595-97720388834407=/root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407 <<< 11762 1726853287.98777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853287.98781: stderr chunk (state=3): >>><<< 11762 1726853287.98783: stdout chunk (state=3): >>><<< 11762 1726853287.98786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853287.9444485-13595-97720388834407=/root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853287.98794: variable 'ansible_module_compression' from source: unknown 11762 1726853287.98797: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11762 1726853287.98799: variable 'ansible_facts' from source: unknown 11762 1726853287.99227: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/AnsiballZ_systemd.py 11762 1726853287.99597: Sending initial data 11762 1726853287.99600: Sent initial data (155 bytes) 11762 1726853288.00893: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853288.00932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853288.01000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853288.01013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853288.01117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853288.02776: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853288.02835: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11762 1726853288.02839: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853288.02892: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853288.02983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpo1e2m3wf /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/AnsiballZ_systemd.py <<< 11762 1726853288.03002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/AnsiballZ_systemd.py" <<< 11762 1726853288.03085: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11762 1726853288.03100: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpo1e2m3wf" to remote "/root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/AnsiballZ_systemd.py" <<< 11762 1726853288.03114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/AnsiballZ_systemd.py" <<< 11762 1726853288.06089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853288.06283: stderr chunk (state=3): >>><<< 11762 1726853288.06286: stdout chunk (state=3): >>><<< 11762 1726853288.06289: done transferring module to remote 11762 1726853288.06291: _low_level_execute_command(): starting 11762 1726853288.06293: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/ /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/AnsiballZ_systemd.py && sleep 0' 11762 1726853288.07242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853288.07291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853288.09474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853288.09478: stdout chunk (state=3): >>><<< 11762 1726853288.09481: stderr chunk (state=3): >>><<< 11762 1726853288.09484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853288.09486: _low_level_execute_command(): starting 11762 1726853288.09489: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/AnsiballZ_systemd.py && sleep 0' 11762 1726853288.10416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853288.10431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853288.10451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853288.10470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853288.10570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853288.10597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853288.10616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853288.10724: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11762 1726853288.40570: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4444160", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304771584", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "499320000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 11762 1726853288.40603: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11762 1726853288.42617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853288.42681: stdout chunk (state=3): >>><<< 11762 1726853288.42684: stderr chunk (state=3): >>><<< 11762 1726853288.42688: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4444160", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304771584", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "499320000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": 
"[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853288.42907: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853288.43082: _low_level_execute_command(): starting 11762 1726853288.43087: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853287.9444485-13595-97720388834407/ > /dev/null 2>&1 && sleep 0' 11762 1726853288.43782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853288.43785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853288.43788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853288.43790: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853288.43792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853288.43848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853288.43852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853288.43876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853288.43983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853288.45926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853288.45948: stdout chunk (state=3): >>><<< 11762 1726853288.45958: stderr chunk (state=3): >>><<< 11762 1726853288.45978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853288.45988: handler run complete 11762 1726853288.46066: attempt loop complete, returning result 11762 1726853288.46074: _execute() done 11762 1726853288.46080: dumping result to json 11762 1726853288.46104: done dumping result, returning 11762 1726853288.46115: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-d845-03d0-00000000069f] 11762 1726853288.46149: sending task result for task 02083763-bbaf-d845-03d0-00000000069f 11762 1726853288.46559: done sending task result for task 02083763-bbaf-d845-03d0-00000000069f 11762 1726853288.46562: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853288.46622: no more pending results, returning what we have 11762 1726853288.46626: results queue empty 11762 1726853288.46627: checking for any_errors_fatal 11762 1726853288.46634: done checking for any_errors_fatal 11762 1726853288.46635: 
checking for max_fail_percentage 11762 1726853288.46637: done checking for max_fail_percentage 11762 1726853288.46638: checking to see if all hosts have failed and the running result is not ok 11762 1726853288.46638: done checking to see if all hosts have failed 11762 1726853288.46639: getting the remaining hosts for this loop 11762 1726853288.46641: done getting the remaining hosts for this loop 11762 1726853288.46647: getting the next task for host managed_node2 11762 1726853288.46656: done getting next task for host managed_node2 11762 1726853288.46659: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11762 1726853288.46665: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853288.46678: getting variables 11762 1726853288.46680: in VariableManager get_vars() 11762 1726853288.46718: Calling all_inventory to load vars for managed_node2 11762 1726853288.46721: Calling groups_inventory to load vars for managed_node2 11762 1726853288.46724: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853288.46735: Calling all_plugins_play to load vars for managed_node2 11762 1726853288.46738: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853288.46744: Calling groups_plugins_play to load vars for managed_node2 11762 1726853288.48514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853288.50062: done with get_vars() 11762 1726853288.50094: done getting variables 11762 1726853288.50158: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:28:08 -0400 (0:00:00.834) 0:00:38.932 ****** 11762 1726853288.50206: entering _queue_task() for managed_node2/service 11762 1726853288.50798: worker is 1 (out of 1 available) 11762 1726853288.50809: exiting _queue_task() for managed_node2/service 11762 1726853288.50821: done queuing things up, now waiting for results queue to drain 11762 1726853288.50822: waiting for pending results... 
11762 1726853288.51093: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11762 1726853288.51099: in run() - task 02083763-bbaf-d845-03d0-0000000006a0 11762 1726853288.51103: variable 'ansible_search_path' from source: unknown 11762 1726853288.51106: variable 'ansible_search_path' from source: unknown 11762 1726853288.51109: calling self._execute() 11762 1726853288.51231: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853288.51235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853288.51238: variable 'omit' from source: magic vars 11762 1726853288.51556: variable 'ansible_distribution_major_version' from source: facts 11762 1726853288.51576: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853288.51675: variable 'network_provider' from source: set_fact 11762 1726853288.51777: Evaluated conditional (network_provider == "nm"): True 11762 1726853288.51780: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853288.51862: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853288.52038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853288.54446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853288.54506: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853288.54580: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853288.54583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853288.54608: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853288.54783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853288.54787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853288.54790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853288.54793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853288.54802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853288.54856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853288.54888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853288.54903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853288.54945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853288.54964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853288.55004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853288.55027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853288.55089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853288.55182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853288.55185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853288.55331: variable 'network_connections' from source: task vars 11762 1726853288.55337: variable 'port2_profile' from source: play vars 11762 1726853288.55567: variable 'port2_profile' from source: play vars 11762 1726853288.55579: variable 'port1_profile' from source: play vars 11762 1726853288.55979: variable 'port1_profile' from source: play vars 11762 1726853288.55983: variable 'controller_profile' from source: play vars 11762 1726853288.55985: variable 'controller_profile' from source: play vars 11762 
1726853288.55987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853288.55992: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853288.56029: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853288.56065: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853288.56093: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853288.56133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853288.56159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853288.56184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853288.56216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853288.56264: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853288.56505: variable 'network_connections' from source: task vars 11762 1726853288.56508: variable 'port2_profile' from source: play vars 11762 1726853288.56566: variable 'port2_profile' from source: play vars 11762 1726853288.56575: variable 'port1_profile' from source: play vars 11762 1726853288.56653: variable 'port1_profile' from source: play vars 11762 1726853288.56666: variable 
'controller_profile' from source: play vars 11762 1726853288.56762: variable 'controller_profile' from source: play vars 11762 1726853288.56765: Evaluated conditional (__network_wpa_supplicant_required): False 11762 1726853288.56767: when evaluation is False, skipping this task 11762 1726853288.56769: _execute() done 11762 1726853288.56774: dumping result to json 11762 1726853288.56777: done dumping result, returning 11762 1726853288.56779: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-d845-03d0-0000000006a0] 11762 1726853288.56781: sending task result for task 02083763-bbaf-d845-03d0-0000000006a0 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11762 1726853288.56982: no more pending results, returning what we have 11762 1726853288.56987: results queue empty 11762 1726853288.56988: checking for any_errors_fatal 11762 1726853288.57009: done checking for any_errors_fatal 11762 1726853288.57009: checking for max_fail_percentage 11762 1726853288.57012: done checking for max_fail_percentage 11762 1726853288.57013: checking to see if all hosts have failed and the running result is not ok 11762 1726853288.57014: done checking to see if all hosts have failed 11762 1726853288.57015: getting the remaining hosts for this loop 11762 1726853288.57017: done getting the remaining hosts for this loop 11762 1726853288.57022: getting the next task for host managed_node2 11762 1726853288.57032: done getting next task for host managed_node2 11762 1726853288.57035: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11762 1726853288.57040: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853288.57060: getting variables 11762 1726853288.57062: in VariableManager get_vars() 11762 1726853288.57106: Calling all_inventory to load vars for managed_node2 11762 1726853288.57109: Calling groups_inventory to load vars for managed_node2 11762 1726853288.57112: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853288.57121: Calling all_plugins_play to load vars for managed_node2 11762 1726853288.57124: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853288.57127: Calling groups_plugins_play to load vars for managed_node2 11762 1726853288.58019: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a0 11762 1726853288.58024: WORKER PROCESS EXITING 11762 1726853288.59542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853288.62863: done with get_vars() 11762 1726853288.62897: done getting variables 11762 1726853288.62953: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:28:08 -0400 (0:00:00.131) 0:00:39.064 ****** 11762 1726853288.63393: entering _queue_task() for managed_node2/service 11762 1726853288.64157: worker is 1 (out of 1 available) 11762 1726853288.64477: exiting _queue_task() for managed_node2/service 11762 1726853288.64495: done queuing things up, now waiting for results queue to drain 11762 1726853288.64498: waiting for pending results... 11762 1726853288.64836: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11762 1726853288.64979: in run() - task 02083763-bbaf-d845-03d0-0000000006a1 11762 1726853288.65394: variable 'ansible_search_path' from source: unknown 11762 1726853288.65398: variable 'ansible_search_path' from source: unknown 11762 1726853288.65401: calling self._execute() 11762 1726853288.65610: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853288.65614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853288.65616: variable 'omit' from source: magic vars 11762 1726853288.66601: variable 'ansible_distribution_major_version' from source: facts 11762 1726853288.66607: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853288.66780: variable 'network_provider' from source: set_fact 11762 1726853288.66860: Evaluated conditional (network_provider == "initscripts"): False 11762 1726853288.66864: when evaluation is False, skipping this task 11762 1726853288.66867: _execute() done 
11762 1726853288.66869: dumping result to json 11762 1726853288.66878: done dumping result, returning 11762 1726853288.66970: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-d845-03d0-0000000006a1] 11762 1726853288.66977: sending task result for task 02083763-bbaf-d845-03d0-0000000006a1 11762 1726853288.67094: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a1 11762 1726853288.67097: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853288.67226: no more pending results, returning what we have 11762 1726853288.67231: results queue empty 11762 1726853288.67232: checking for any_errors_fatal 11762 1726853288.67259: done checking for any_errors_fatal 11762 1726853288.67261: checking for max_fail_percentage 11762 1726853288.67264: done checking for max_fail_percentage 11762 1726853288.67265: checking to see if all hosts have failed and the running result is not ok 11762 1726853288.67266: done checking to see if all hosts have failed 11762 1726853288.67266: getting the remaining hosts for this loop 11762 1726853288.67269: done getting the remaining hosts for this loop 11762 1726853288.67274: getting the next task for host managed_node2 11762 1726853288.67282: done getting next task for host managed_node2 11762 1726853288.67286: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11762 1726853288.67291: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853288.67309: getting variables 11762 1726853288.67311: in VariableManager get_vars() 11762 1726853288.67468: Calling all_inventory to load vars for managed_node2 11762 1726853288.67474: Calling groups_inventory to load vars for managed_node2 11762 1726853288.67476: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853288.67486: Calling all_plugins_play to load vars for managed_node2 11762 1726853288.67489: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853288.67492: Calling groups_plugins_play to load vars for managed_node2 11762 1726853288.70236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853288.71751: done with get_vars() 11762 1726853288.71799: done getting variables 11762 1726853288.71866: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is 
present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:28:08 -0400 (0:00:00.089) 0:00:39.153 ****** 11762 1726853288.72307: entering _queue_task() for managed_node2/copy 11762 1726853288.73192: worker is 1 (out of 1 available) 11762 1726853288.73205: exiting _queue_task() for managed_node2/copy 11762 1726853288.73217: done queuing things up, now waiting for results queue to drain 11762 1726853288.73218: waiting for pending results... 11762 1726853288.73896: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11762 1726853288.74225: in run() - task 02083763-bbaf-d845-03d0-0000000006a2 11762 1726853288.74253: variable 'ansible_search_path' from source: unknown 11762 1726853288.74287: variable 'ansible_search_path' from source: unknown 11762 1726853288.74426: calling self._execute() 11762 1726853288.74623: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853288.74670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853288.74863: variable 'omit' from source: magic vars 11762 1726853288.75838: variable 'ansible_distribution_major_version' from source: facts 11762 1726853288.75845: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853288.75921: variable 'network_provider' from source: set_fact 11762 1726853288.75986: Evaluated conditional (network_provider == "initscripts"): False 11762 1726853288.76062: when evaluation is False, skipping this task 11762 1726853288.76070: _execute() done 11762 1726853288.76080: dumping result to json 11762 1726853288.76091: done dumping result, returning 11762 1726853288.76106: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-d845-03d0-0000000006a2] 11762 
1726853288.76119: sending task result for task 02083763-bbaf-d845-03d0-0000000006a2 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11762 1726853288.76533: no more pending results, returning what we have 11762 1726853288.76538: results queue empty 11762 1726853288.76539: checking for any_errors_fatal 11762 1726853288.76547: done checking for any_errors_fatal 11762 1726853288.76549: checking for max_fail_percentage 11762 1726853288.76555: done checking for max_fail_percentage 11762 1726853288.76555: checking to see if all hosts have failed and the running result is not ok 11762 1726853288.76556: done checking to see if all hosts have failed 11762 1726853288.76557: getting the remaining hosts for this loop 11762 1726853288.76559: done getting the remaining hosts for this loop 11762 1726853288.76563: getting the next task for host managed_node2 11762 1726853288.76573: done getting next task for host managed_node2 11762 1726853288.76578: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11762 1726853288.76584: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853288.76609: getting variables 11762 1726853288.76611: in VariableManager get_vars() 11762 1726853288.76657: Calling all_inventory to load vars for managed_node2 11762 1726853288.76660: Calling groups_inventory to load vars for managed_node2 11762 1726853288.76663: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853288.76849: Calling all_plugins_play to load vars for managed_node2 11762 1726853288.76853: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853288.76858: Calling groups_plugins_play to load vars for managed_node2 11762 1726853288.77814: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a2 11762 1726853288.77819: WORKER PROCESS EXITING 11762 1726853288.79733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853288.83155: done with get_vars() 11762 1726853288.83193: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:28:08 -0400 (0:00:00.109) 0:00:39.263 ****** 11762 1726853288.83291: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11762 1726853288.83648: worker is 1 (out of 1 available) 11762 1726853288.83663: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11762 1726853288.83780: done queuing things up, now waiting for results queue to drain 11762 1726853288.83782: waiting for pending results... 
11762 1726853288.84092: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11762 1726853288.84189: in run() - task 02083763-bbaf-d845-03d0-0000000006a3 11762 1726853288.84193: variable 'ansible_search_path' from source: unknown 11762 1726853288.84196: variable 'ansible_search_path' from source: unknown 11762 1726853288.84208: calling self._execute() 11762 1726853288.84405: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853288.84409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853288.84412: variable 'omit' from source: magic vars 11762 1726853288.84731: variable 'ansible_distribution_major_version' from source: facts 11762 1726853288.84740: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853288.84747: variable 'omit' from source: magic vars 11762 1726853288.84880: variable 'omit' from source: magic vars 11762 1726853288.85049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853288.88162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853288.88237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853288.88274: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853288.88311: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853288.88345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853288.88438: variable 'network_provider' from source: set_fact 11762 1726853288.88580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853288.88607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853288.88637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853288.88685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853288.88700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853288.88806: variable 'omit' from source: magic vars 11762 1726853288.88897: variable 'omit' from source: magic vars 11762 1726853288.89023: variable 'network_connections' from source: task vars 11762 1726853288.89026: variable 'port2_profile' from source: play vars 11762 1726853288.89104: variable 'port2_profile' from source: play vars 11762 1726853288.89241: variable 'port1_profile' from source: play vars 11762 1726853288.89247: variable 'port1_profile' from source: play vars 11762 1726853288.89250: variable 'controller_profile' from source: play vars 11762 1726853288.89252: variable 'controller_profile' from source: play vars 11762 1726853288.89660: variable 'omit' from source: magic vars 11762 1726853288.89669: variable '__lsr_ansible_managed' from source: task vars 11762 1726853288.90278: variable '__lsr_ansible_managed' from source: task vars 11762 1726853288.90333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11762 
1726853288.90898: Loaded config def from plugin (lookup/template) 11762 1726853288.90901: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11762 1726853288.91048: File lookup term: get_ansible_managed.j2 11762 1726853288.91051: variable 'ansible_search_path' from source: unknown 11762 1726853288.91057: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11762 1726853288.91074: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11762 1726853288.91093: variable 'ansible_search_path' from source: unknown 11762 1726853288.99283: variable 'ansible_managed' from source: unknown 11762 1726853288.99418: variable 'omit' from source: magic vars 11762 1726853288.99444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853288.99476: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853288.99494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853288.99510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853288.99519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853288.99546: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853288.99549: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853288.99551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853288.99640: Set connection var ansible_timeout to 10 11762 1726853288.99646: Set connection var ansible_shell_type to sh 11762 1726853288.99648: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853288.99654: Set connection var ansible_shell_executable to /bin/sh 11762 1726853288.99662: Set connection var ansible_pipelining to False 11762 1726853288.99673: Set connection var ansible_connection to ssh 11762 1726853288.99702: variable 'ansible_shell_executable' from source: unknown 11762 1726853288.99706: variable 'ansible_connection' from source: unknown 11762 1726853288.99708: variable 'ansible_module_compression' from source: unknown 11762 1726853288.99710: variable 'ansible_shell_type' from source: unknown 11762 1726853288.99713: variable 'ansible_shell_executable' from source: unknown 11762 1726853288.99715: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853288.99717: variable 'ansible_pipelining' from source: unknown 11762 1726853288.99721: variable 'ansible_timeout' from source: unknown 11762 1726853288.99732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 
1726853288.99889: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853288.99893: variable 'omit' from source: magic vars 11762 1726853288.99896: starting attempt loop 11762 1726853288.99899: running the handler 11762 1726853288.99902: _low_level_execute_command(): starting 11762 1726853288.99976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853289.01287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853289.01292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853289.01344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853289.01398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853289.01601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853289.01744: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11762 1726853289.03453: stdout chunk (state=3): >>>/root <<< 11762 1726853289.03766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853289.03770: stdout chunk (state=3): >>><<< 11762 1726853289.03880: stderr chunk (state=3): >>><<< 11762 1726853289.03885: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853289.03888: _low_level_execute_command(): starting 11762 1726853289.03890: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004 `" && echo ansible-tmp-1726853289.0381625-13664-86764867745004="` echo /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004 `" 
) && sleep 0' 11762 1726853289.05083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853289.05188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853289.05390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853289.05405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853289.05586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853289.07602: stdout chunk (state=3): >>>ansible-tmp-1726853289.0381625-13664-86764867745004=/root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004 <<< 11762 1726853289.07748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853289.07760: stdout chunk (state=3): >>><<< 11762 1726853289.07778: stderr chunk (state=3): >>><<< 11762 1726853289.07802: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853289.0381625-13664-86764867745004=/root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853289.07859: variable 'ansible_module_compression' from source: unknown 11762 1726853289.08031: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11762 1726853289.08073: variable 'ansible_facts' from source: unknown 11762 1726853289.08243: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/AnsiballZ_network_connections.py 11762 1726853289.08543: Sending initial data 11762 1726853289.08549: Sent initial data (167 bytes) 11762 1726853289.09299: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853289.09308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853289.09385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853289.09404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853289.09422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853289.09518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853289.11279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853289.11319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853289.11426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmptf9bspjg /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/AnsiballZ_network_connections.py <<< 11762 1726853289.11430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/AnsiballZ_network_connections.py" <<< 11762 1726853289.11481: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmptf9bspjg" to remote "/root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/AnsiballZ_network_connections.py" <<< 11762 1726853289.11484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/AnsiballZ_network_connections.py" <<< 11762 1726853289.12761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853289.12764: stdout chunk (state=3): >>><<< 11762 1726853289.12772: stderr chunk (state=3): >>><<< 11762 1726853289.12825: done transferring module to remote 11762 1726853289.12864: _low_level_execute_command(): starting 11762 1726853289.12868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/ /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/AnsiballZ_network_connections.py && sleep 0' 11762 1726853289.13587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853289.13851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853289.13864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853289.13887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853289.14308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853289.16326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853289.16330: stdout chunk (state=3): >>><<< 11762 1726853289.16332: stderr chunk (state=3): >>><<< 11762 1726853289.16335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853289.16337: _low_level_execute_command(): starting 11762 1726853289.16339: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/AnsiballZ_network_connections.py && sleep 0' 11762 1726853289.17436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853289.17580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853289.18090: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853289.18214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853289.77432: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11762 1726853289.77479: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/1e65cb07-7038-41b3-8603-a8db12da667c: error=unknown <<< 11762 1726853289.79291: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/09dac36d-59ab-47a3-bd65-311b47a40724: error=unknown <<< 11762 1726853289.81165: stdout chunk (state=3): >>>Traceback (most recent 
call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back<<< 11762 1726853289.81197: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/131ea31c-3b61-4971-a483-9ea88268ee14: error=unknown <<< 11762 1726853289.81691: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11762 1726853289.83438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853289.83467: stderr chunk (state=3): >>><<< 11762 1726853289.83478: stdout chunk (state=3): >>><<< 11762 1726853289.83504: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/1e65cb07-7038-41b3-8603-a8db12da667c: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/09dac36d-59ab-47a3-bd65-311b47a40724: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zcg2f7er/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/131ea31c-3b61-4971-a483-9ea88268ee14: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853289.83555: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853289.83574: 
_low_level_execute_command(): starting 11762 1726853289.83585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853289.0381625-13664-86764867745004/ > /dev/null 2>&1 && sleep 0' 11762 1726853289.84519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853289.84528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853289.84539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853289.84564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853289.84584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853289.84592: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853289.84602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853289.84616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853289.84624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853289.84631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853289.84891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853289.84895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853289.84897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853289.84900: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853289.84902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853289.86935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853289.86939: stdout chunk (state=3): >>><<< 11762 1726853289.86951: stderr chunk (state=3): >>><<< 11762 1726853289.86963: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853289.86968: handler run complete 11762 1726853289.87006: attempt loop complete, returning result 11762 1726853289.87009: _execute() done 11762 1726853289.87012: dumping result to json 11762 1726853289.87021: done dumping result, returning 11762 1726853289.87026: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : 
Configure networking connection profiles [02083763-bbaf-d845-03d0-0000000006a3] 11762 1726853289.87031: sending task result for task 02083763-bbaf-d845-03d0-0000000006a3 11762 1726853289.87156: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a3 11762 1726853289.87159: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11762 1726853289.87279: no more pending results, returning what we have 11762 1726853289.87283: results queue empty 11762 1726853289.87283: checking for any_errors_fatal 11762 1726853289.87290: done checking for any_errors_fatal 11762 1726853289.87291: checking for max_fail_percentage 11762 1726853289.87293: done checking for max_fail_percentage 11762 1726853289.87294: checking to see if all hosts have failed and the running result is not ok 11762 1726853289.87294: done checking to see if all hosts have failed 11762 1726853289.87295: getting the remaining hosts for this loop 11762 1726853289.87297: done getting the remaining hosts for this loop 11762 1726853289.87300: getting the next task for host managed_node2 11762 1726853289.87307: done getting next task for host managed_node2 11762 1726853289.87310: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11762 1726853289.87315: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853289.87325: getting variables
11762 1726853289.87326: in VariableManager get_vars()
11762 1726853289.87362: Calling all_inventory to load vars for managed_node2
11762 1726853289.87364: Calling groups_inventory to load vars for managed_node2
11762 1726853289.87366: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853289.87481: Calling all_plugins_play to load vars for managed_node2
11762 1726853289.87485: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853289.87489: Calling groups_plugins_play to load vars for managed_node2
11762 1726853289.88806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853289.90562: done with get_vars()
11762 1726853289.90588: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024  13:28:09 -0400 (0:00:01.073)       0:00:40.337 ******
11762 1726853289.90688: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state
11762 1726853289.91286: worker is 1 (out of 1 available)
11762 1726853289.91295: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state
11762 1726853289.91306: done queuing things up, now waiting for results queue to drain
11762 1726853289.91307: waiting for pending results...
11762 1726853289.91493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state
11762 1726853289.91515: in run() - task 02083763-bbaf-d845-03d0-0000000006a4
11762 1726853289.91538: variable 'ansible_search_path' from source: unknown
11762 1726853289.91545: variable 'ansible_search_path' from source: unknown
11762 1726853289.91586: calling self._execute()
11762 1726853289.91688: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853289.91708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853289.91715: variable 'omit' from source: magic vars
11762 1726853289.92143: variable 'ansible_distribution_major_version' from source: facts
11762 1726853289.92150: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853289.92293: variable 'network_state' from source: role '' defaults
11762 1726853289.92305: Evaluated conditional (network_state != {}): False
11762 1726853289.92360: when evaluation is False, skipping this task
11762 1726853289.92363: _execute() done
11762 1726853289.92366: dumping result to json
11762 1726853289.92368: done dumping result, returning
11762 1726853289.92373: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-d845-03d0-0000000006a4]
11762 1726853289.92375: sending task result for task 02083763-bbaf-d845-03d0-0000000006a4
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11762 1726853289.92634: no more pending results, returning what we have
11762 1726853289.92639: results queue empty
11762 1726853289.92640: checking for any_errors_fatal
11762 1726853289.92653: done checking for any_errors_fatal
11762 1726853289.92654: checking for max_fail_percentage
11762 1726853289.92656: done checking for max_fail_percentage
11762 1726853289.92657: checking to see if all hosts have failed and the running result is not ok
11762 1726853289.92658: done checking to see if all hosts have failed
11762 1726853289.92658: getting the remaining hosts for this loop
11762 1726853289.92660: done getting the remaining hosts for this loop
11762 1726853289.92665: getting the next task for host managed_node2
11762 1726853289.92675: done getting next task for host managed_node2
11762 1726853289.92681: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
11762 1726853289.92687: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853289.92709: getting variables
11762 1726853289.92710: in VariableManager get_vars()
11762 1726853289.92753: Calling all_inventory to load vars for managed_node2
11762 1726853289.92756: Calling groups_inventory to load vars for managed_node2
11762 1726853289.92759: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853289.92886: Calling all_plugins_play to load vars for managed_node2
11762 1726853289.92892: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853289.92896: Calling groups_plugins_play to load vars for managed_node2
11762 1726853289.93650: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a4
11762 1726853289.93654: WORKER PROCESS EXITING
11762 1726853289.95534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853289.98686: done with get_vars()
11762 1726853289.98725: done getting variables
11762 1726853289.98790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024  13:28:09 -0400 (0:00:00.081)       0:00:40.419 ******
11762 1726853289.98898: entering _queue_task() for managed_node2/debug
11762 1726853289.99380: worker is 1 (out of 1 available)
11762 1726853289.99393: exiting _queue_task() for managed_node2/debug
11762 1726853289.99404: done queuing things up, now waiting for results queue to drain
11762 1726853289.99405: waiting for pending results...
11762 1726853289.99938: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
11762 1726853290.00279: in run() - task 02083763-bbaf-d845-03d0-0000000006a5
11762 1726853290.00300: variable 'ansible_search_path' from source: unknown
11762 1726853290.00306: variable 'ansible_search_path' from source: unknown
11762 1726853290.00344: calling self._execute()
11762 1726853290.00577: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.00778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.00784: variable 'omit' from source: magic vars
11762 1726853290.01473: variable 'ansible_distribution_major_version' from source: facts
11762 1726853290.01664: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853290.01668: variable 'omit' from source: magic vars
11762 1726853290.01673: variable 'omit' from source: magic vars
11762 1726853290.01805: variable 'omit' from source: magic vars
11762 1726853290.01854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853290.02079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853290.02082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853290.02087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853290.02090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853290.02162: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853290.02165: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.02168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.02345: Set connection var ansible_timeout to 10
11762 1726853290.02349: Set connection var ansible_shell_type to sh
11762 1726853290.02351: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853290.02356: Set connection var ansible_shell_executable to /bin/sh
11762 1726853290.02365: Set connection var ansible_pipelining to False
11762 1726853290.02374: Set connection var ansible_connection to ssh
11762 1726853290.02399: variable 'ansible_shell_executable' from source: unknown
11762 1726853290.02402: variable 'ansible_connection' from source: unknown
11762 1726853290.02405: variable 'ansible_module_compression' from source: unknown
11762 1726853290.02407: variable 'ansible_shell_type' from source: unknown
11762 1726853290.02409: variable 'ansible_shell_executable' from source: unknown
11762 1726853290.02412: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.02419: variable 'ansible_pipelining' from source: unknown
11762 1726853290.02577: variable 'ansible_timeout' from source: unknown
11762 1726853290.02580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.02704: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853290.02715: variable 'omit' from source: magic vars
11762 1726853290.02721: starting attempt loop
11762 1726853290.02724: running the handler
11762 1726853290.02868: variable '__network_connections_result' from source: set_fact
11762 1726853290.02925: handler run complete
11762 1726853290.02966: attempt loop complete, returning result
11762 1726853290.02969: _execute() done
11762 1726853290.02975: dumping result to json
11762 1726853290.02977: done dumping result, returning
11762 1726853290.02980: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-d845-03d0-0000000006a5]
11762 1726853290.02982: sending task result for task 02083763-bbaf-d845-03d0-0000000006a5
11762 1726853290.03139: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a5
11762 1726853290.03144: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
11762 1726853290.03237: no more pending results, returning what we have
11762 1726853290.03241: results queue empty
11762 1726853290.03242: checking for any_errors_fatal
11762 1726853290.03249: done checking for any_errors_fatal
11762 1726853290.03250: checking for max_fail_percentage
11762 1726853290.03252: done checking for max_fail_percentage
11762 1726853290.03252: checking to see if all hosts have failed and the running result is not ok
11762 1726853290.03253: done checking to see if all hosts have failed
11762 1726853290.03254: getting the remaining hosts for this loop
11762 1726853290.03255: done getting the remaining hosts for this loop
11762 1726853290.03259: getting the next task for host managed_node2
11762 1726853290.03265: done getting next task for host managed_node2
11762 1726853290.03268: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
11762 1726853290.03275: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853290.03286: getting variables
11762 1726853290.03287: in VariableManager get_vars()
11762 1726853290.03324: Calling all_inventory to load vars for managed_node2
11762 1726853290.03326: Calling groups_inventory to load vars for managed_node2
11762 1726853290.03328: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853290.03336: Calling all_plugins_play to load vars for managed_node2
11762 1726853290.03339: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853290.03342: Calling groups_plugins_play to load vars for managed_node2
11762 1726853290.04989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853290.06524: done with get_vars()
11762 1726853290.06553: done getting variables
11762 1726853290.06620: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024  13:28:10 -0400 (0:00:00.077)       0:00:40.497 ******
11762 1726853290.06669: entering _queue_task() for managed_node2/debug
11762 1726853290.07037: worker is 1 (out of 1 available)
11762 1726853290.07050: exiting _queue_task() for managed_node2/debug
11762 1726853290.07179: done queuing things up, now waiting for results queue to drain
11762 1726853290.07181: waiting for pending results...
11762 1726853290.07389: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
11762 1726853290.07578: in run() - task 02083763-bbaf-d845-03d0-0000000006a6
11762 1726853290.07582: variable 'ansible_search_path' from source: unknown
11762 1726853290.07585: variable 'ansible_search_path' from source: unknown
11762 1726853290.07629: calling self._execute()
11762 1726853290.07700: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.07706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.07736: variable 'omit' from source: magic vars
11762 1726853290.08115: variable 'ansible_distribution_major_version' from source: facts
11762 1726853290.08126: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853290.08169: variable 'omit' from source: magic vars
11762 1726853290.08212: variable 'omit' from source: magic vars
11762 1726853290.08248: variable 'omit' from source: magic vars
11762 1726853290.08294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853290.08330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853290.08377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853290.08380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853290.08388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853290.08419: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853290.08422: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.08425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.08530: Set connection var ansible_timeout to 10
11762 1726853290.08534: Set connection var ansible_shell_type to sh
11762 1726853290.08538: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853290.08546: Set connection var ansible_shell_executable to /bin/sh
11762 1726853290.08551: Set connection var ansible_pipelining to False
11762 1726853290.08558: Set connection var ansible_connection to ssh
11762 1726853290.08580: variable 'ansible_shell_executable' from source: unknown
11762 1726853290.08588: variable 'ansible_connection' from source: unknown
11762 1726853290.08596: variable 'ansible_module_compression' from source: unknown
11762 1726853290.08674: variable 'ansible_shell_type' from source: unknown
11762 1726853290.08679: variable 'ansible_shell_executable' from source: unknown
11762 1726853290.08681: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.08684: variable 'ansible_pipelining' from source: unknown
11762 1726853290.08686: variable 'ansible_timeout' from source: unknown
11762 1726853290.08688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.08747: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853290.08755: variable 'omit' from source: magic vars
11762 1726853290.08760: starting attempt loop
11762 1726853290.08763: running the handler
11762 1726853290.08820: variable '__network_connections_result' from source: set_fact
11762 1726853290.08887: variable '__network_connections_result' from source: set_fact
11762 1726853290.09010: handler run complete
11762 1726853290.09041: attempt loop complete, returning result
11762 1726853290.09047: _execute() done
11762 1726853290.09050: dumping result to json
11762 1726853290.09052: done dumping result, returning
11762 1726853290.09148: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-d845-03d0-0000000006a6]
11762 1726853290.09150: sending task result for task 02083763-bbaf-d845-03d0-0000000006a6
11762 1726853290.09218: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a6
11762 1726853290.09220: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0.1",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0.0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
11762 1726853290.09320: no more pending results, returning what we have
11762 1726853290.09324: results queue empty
11762 1726853290.09324: checking for any_errors_fatal
11762 1726853290.09330: done checking for any_errors_fatal
11762 1726853290.09330: checking for max_fail_percentage
11762 1726853290.09332: done checking for max_fail_percentage
11762 1726853290.09333: checking to see if all hosts have failed and the running result is not ok
11762 1726853290.09334: done checking to see if all hosts have failed
11762 1726853290.09334: getting the remaining hosts for this loop
11762 1726853290.09336: done getting the remaining hosts for this loop
11762 1726853290.09339: getting the next task for host managed_node2
11762 1726853290.09347: done getting next task for host managed_node2
11762 1726853290.09349: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
11762 1726853290.09354: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853290.09365: getting variables
11762 1726853290.09367: in VariableManager get_vars()
11762 1726853290.09406: Calling all_inventory to load vars for managed_node2
11762 1726853290.09416: Calling groups_inventory to load vars for managed_node2
11762 1726853290.09420: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853290.09430: Calling all_plugins_play to load vars for managed_node2
11762 1726853290.09434: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853290.09437: Calling groups_plugins_play to load vars for managed_node2
11762 1726853290.10933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853290.12544: done with get_vars()
11762 1726853290.12576: done getting variables
11762 1726853290.12646: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024  13:28:10 -0400 (0:00:00.060)       0:00:40.557 ******
11762 1726853290.12687: entering _queue_task() for managed_node2/debug
11762 1726853290.13178: worker is 1 (out of 1 available)
11762 1726853290.13190: exiting _queue_task() for managed_node2/debug
11762 1726853290.13202: done queuing things up, now waiting for results queue to drain
11762 1726853290.13204: waiting for pending results...
11762 1726853290.13410: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
11762 1726853290.13775: in run() - task 02083763-bbaf-d845-03d0-0000000006a7
11762 1726853290.13784: variable 'ansible_search_path' from source: unknown
11762 1726853290.13787: variable 'ansible_search_path' from source: unknown
11762 1726853290.13789: calling self._execute()
11762 1726853290.13792: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.13794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.13796: variable 'omit' from source: magic vars
11762 1726853290.14137: variable 'ansible_distribution_major_version' from source: facts
11762 1726853290.14153: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853290.14277: variable 'network_state' from source: role '' defaults
11762 1726853290.14284: Evaluated conditional (network_state != {}): False
11762 1726853290.14287: when evaluation is False, skipping this task
11762 1726853290.14290: _execute() done
11762 1726853290.14292: dumping result to json
11762 1726853290.14295: done dumping result, returning
11762 1726853290.14305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-d845-03d0-0000000006a7]
11762 1726853290.14311: sending task result for task 02083763-bbaf-d845-03d0-0000000006a7
11762 1726853290.14409: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a7
11762 1726853290.14412: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
11762 1726853290.14516: no more pending results, returning what we have
11762 1726853290.14521: results queue empty
11762 1726853290.14522: checking for any_errors_fatal
11762 1726853290.14531: done checking for any_errors_fatal
11762 1726853290.14532: checking for max_fail_percentage
11762 1726853290.14534: done checking for max_fail_percentage
11762 1726853290.14535: checking to see if all hosts have failed and the running result is not ok
11762 1726853290.14535: done checking to see if all hosts have failed
11762 1726853290.14536: getting the remaining hosts for this loop
11762 1726853290.14538: done getting the remaining hosts for this loop
11762 1726853290.14541: getting the next task for host managed_node2
11762 1726853290.14550: done getting next task for host managed_node2
11762 1726853290.14553: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
11762 1726853290.14559: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853290.14579: getting variables
11762 1726853290.14581: in VariableManager get_vars()
11762 1726853290.14617: Calling all_inventory to load vars for managed_node2
11762 1726853290.14621: Calling groups_inventory to load vars for managed_node2
11762 1726853290.14623: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853290.14634: Calling all_plugins_play to load vars for managed_node2
11762 1726853290.14636: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853290.14639: Calling groups_plugins_play to load vars for managed_node2
11762 1726853290.16262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853290.17826: done with get_vars()
11762 1726853290.17857: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024  13:28:10 -0400 (0:00:00.052)       0:00:40.610 ******
11762 1726853290.17970: entering _queue_task() for managed_node2/ping
11762 1726853290.18336: worker is 1 (out of 1 available)
11762 1726853290.18464: exiting _queue_task() for managed_node2/ping
11762 1726853290.18479: done queuing things up, now waiting for results queue to drain
11762 1726853290.18481: waiting for pending results...
11762 1726853290.18696: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
11762 1726853290.18863: in run() - task 02083763-bbaf-d845-03d0-0000000006a8
11762 1726853290.18878: variable 'ansible_search_path' from source: unknown
11762 1726853290.18883: variable 'ansible_search_path' from source: unknown
11762 1726853290.18927: calling self._execute()
11762 1726853290.19032: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.19057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.19060: variable 'omit' from source: magic vars
11762 1726853290.19441: variable 'ansible_distribution_major_version' from source: facts
11762 1726853290.19477: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853290.19480: variable 'omit' from source: magic vars
11762 1726853290.19548: variable 'omit' from source: magic vars
11762 1726853290.19602: variable 'omit' from source: magic vars
11762 1726853290.19619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853290.19676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853290.19680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853290.19693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853290.19708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853290.19732: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853290.19734: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.19778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.19835: Set connection var ansible_timeout to 10
11762 1726853290.19838: Set connection var ansible_shell_type to sh
11762 1726853290.19846: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853290.19849: Set connection var ansible_shell_executable to /bin/sh
11762 1726853290.19856: Set connection var ansible_pipelining to False
11762 1726853290.19863: Set connection var ansible_connection to ssh
11762 1726853290.19892: variable 'ansible_shell_executable' from source: unknown
11762 1726853290.19895: variable 'ansible_connection' from source: unknown
11762 1726853290.19898: variable 'ansible_module_compression' from source: unknown
11762 1726853290.19900: variable 'ansible_shell_type' from source: unknown
11762 1726853290.19903: variable 'ansible_shell_executable' from source: unknown
11762 1726853290.19905: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853290.19907: variable 'ansible_pipelining' from source: unknown
11762 1726853290.19909: variable 'ansible_timeout' from source: unknown
11762 1726853290.19976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853290.20111: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
11762 1726853290.20121: variable 'omit' from source: magic vars
11762 1726853290.20126: starting attempt loop
11762 1726853290.20129: running the handler
11762 1726853290.20146: _low_level_execute_command(): starting
11762 1726853290.20149: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11762 1726853290.20957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'
debug2: fd 3 setting O_NONBLOCK
<<<
11762 1726853290.20985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
11762 1726853290.21099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
11762 1726853290.22831: stdout chunk (state=3): >>>/root
<<<
11762 1726853290.23077: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
11762 1726853290.23081: stdout chunk (state=3): >>><<<
11762 1726853290.23085: stderr chunk (state=3): >>><<<
11762 1726853290.23089: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853290.23091: _low_level_execute_command(): starting 11762 1726853290.23093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002 `" && echo ansible-tmp-1726853290.230137-13745-81130780945002="` echo /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002 `" ) && sleep 0' 11762 1726853290.23652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.23657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.23669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.23686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.23698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853290.23705: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853290.23715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.23759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853290.23763: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853290.23768: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853290.23775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.23778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.23781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.23783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853290.23785: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853290.23868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.23874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.23877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.23906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.24020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.26015: stdout chunk (state=3): >>>ansible-tmp-1726853290.230137-13745-81130780945002=/root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002 <<< 11762 1726853290.26203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.26207: stdout chunk (state=3): >>><<< 11762 1726853290.26210: stderr chunk (state=3): >>><<< 11762 1726853290.26279: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853290.230137-13745-81130780945002=/root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853290.26306: variable 'ansible_module_compression' from source: unknown 11762 1726853290.26359: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11762 1726853290.26415: variable 'ansible_facts' from source: unknown 11762 1726853290.26514: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/AnsiballZ_ping.py 11762 1726853290.26783: Sending initial data 11762 1726853290.26787: Sent initial data (151 bytes) 11762 1726853290.27376: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.27388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.27451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.27463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.27474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.27585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.29262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853290.29352: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853290.29413: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpo4noup7w /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/AnsiballZ_ping.py <<< 11762 1726853290.29416: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/AnsiballZ_ping.py" <<< 11762 1726853290.29708: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpo4noup7w" to remote "/root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/AnsiballZ_ping.py" <<< 11762 1726853290.30572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.30653: stderr chunk (state=3): >>><<< 11762 1726853290.30689: stdout chunk (state=3): >>><<< 11762 1726853290.30708: done transferring module to remote 11762 1726853290.30723: _low_level_execute_command(): starting 11762 1726853290.30800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/ /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/AnsiballZ_ping.py && sleep 0' 11762 1726853290.31428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.31447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.31564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.31584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.31601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.31618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.31640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.31748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.33766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.33786: stdout chunk (state=3): >>><<< 11762 1726853290.33802: stderr chunk (state=3): >>><<< 11762 1726853290.33824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853290.33832: _low_level_execute_command(): starting 11762 1726853290.33841: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/AnsiballZ_ping.py && sleep 0' 11762 1726853290.34500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.34517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.34532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.34563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.34691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 
3 setting O_NONBLOCK <<< 11762 1726853290.34706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.34820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.50409: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11762 1726853290.51874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853290.51908: stdout chunk (state=3): >>><<< 11762 1726853290.51911: stderr chunk (state=3): >>><<< 11762 1726853290.51928: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853290.51964: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853290.51986: _low_level_execute_command(): starting 11762 1726853290.52074: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853290.230137-13745-81130780945002/ > /dev/null 2>&1 && sleep 0' 11762 1726853290.52657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.52751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853290.52792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.52819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.52840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.52961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.54940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.54947: stdout chunk (state=3): >>><<< 11762 1726853290.55179: stderr chunk (state=3): >>><<< 11762 1726853290.55183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853290.55190: handler run complete 11762 1726853290.55192: attempt loop complete, returning result 11762 
1726853290.55194: _execute() done 11762 1726853290.55195: dumping result to json 11762 1726853290.55197: done dumping result, returning 11762 1726853290.55199: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-d845-03d0-0000000006a8] 11762 1726853290.55200: sending task result for task 02083763-bbaf-d845-03d0-0000000006a8 11762 1726853290.55265: done sending task result for task 02083763-bbaf-d845-03d0-0000000006a8 11762 1726853290.55267: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11762 1726853290.55382: no more pending results, returning what we have 11762 1726853290.55385: results queue empty 11762 1726853290.55386: checking for any_errors_fatal 11762 1726853290.55391: done checking for any_errors_fatal 11762 1726853290.55392: checking for max_fail_percentage 11762 1726853290.55395: done checking for max_fail_percentage 11762 1726853290.55395: checking to see if all hosts have failed and the running result is not ok 11762 1726853290.55396: done checking to see if all hosts have failed 11762 1726853290.55397: getting the remaining hosts for this loop 11762 1726853290.55398: done getting the remaining hosts for this loop 11762 1726853290.55401: getting the next task for host managed_node2 11762 1726853290.55410: done getting next task for host managed_node2 11762 1726853290.55412: ^ task is: TASK: meta (role_complete) 11762 1726853290.55417: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853290.55556: getting variables 11762 1726853290.55558: in VariableManager get_vars() 11762 1726853290.55599: Calling all_inventory to load vars for managed_node2 11762 1726853290.55602: Calling groups_inventory to load vars for managed_node2 11762 1726853290.55605: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853290.55613: Calling all_plugins_play to load vars for managed_node2 11762 1726853290.55616: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853290.55619: Calling groups_plugins_play to load vars for managed_node2 11762 1726853290.57137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853290.58922: done with get_vars() 11762 1726853290.58946: done getting variables 11762 1726853290.59043: done queuing things up, now waiting for results queue to drain 11762 1726853290.59045: results queue empty 11762 1726853290.59046: checking for any_errors_fatal 11762 1726853290.59049: done checking for any_errors_fatal 11762 1726853290.59050: checking for max_fail_percentage 11762 1726853290.59051: done checking for max_fail_percentage 11762 1726853290.59052: checking to see if all hosts have failed and the running result is not ok 11762 
1726853290.59052: done checking to see if all hosts have failed 11762 1726853290.59053: getting the remaining hosts for this loop 11762 1726853290.59054: done getting the remaining hosts for this loop 11762 1726853290.59057: getting the next task for host managed_node2 11762 1726853290.59062: done getting next task for host managed_node2 11762 1726853290.59064: ^ task is: TASK: Delete the device '{{ controller_device }}' 11762 1726853290.59066: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853290.59069: getting variables 11762 1726853290.59070: in VariableManager get_vars() 11762 1726853290.59089: Calling all_inventory to load vars for managed_node2 11762 1726853290.59092: Calling groups_inventory to load vars for managed_node2 11762 1726853290.59094: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853290.59099: Calling all_plugins_play to load vars for managed_node2 11762 1726853290.59102: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853290.59104: Calling groups_plugins_play to load vars for managed_node2 11762 1726853290.60246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853290.61798: done with get_vars() 11762 1726853290.61833: done getting variables 11762 1726853290.61885: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853290.62012: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 13:28:10 -0400 (0:00:00.440) 0:00:41.050 ****** 11762 1726853290.62042: entering _queue_task() for managed_node2/command 11762 1726853290.62445: worker is 1 (out of 1 available) 11762 1726853290.62458: exiting _queue_task() for managed_node2/command 11762 1726853290.62620: done queuing things up, now waiting for results queue to drain 11762 1726853290.62622: waiting for pending results... 
11762 1726853290.62777: running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' 11762 1726853290.62910: in run() - task 02083763-bbaf-d845-03d0-0000000006d8 11762 1726853290.62932: variable 'ansible_search_path' from source: unknown 11762 1726853290.62945: variable 'ansible_search_path' from source: unknown 11762 1726853290.62993: calling self._execute() 11762 1726853290.63107: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853290.63119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853290.63130: variable 'omit' from source: magic vars 11762 1726853290.63538: variable 'ansible_distribution_major_version' from source: facts 11762 1726853290.63557: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853290.63569: variable 'omit' from source: magic vars 11762 1726853290.63605: variable 'omit' from source: magic vars 11762 1726853290.63716: variable 'controller_device' from source: play vars 11762 1726853290.63741: variable 'omit' from source: magic vars 11762 1726853290.63792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853290.63843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853290.63874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853290.63898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853290.63916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853290.64076: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853290.64079: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853290.64082: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853290.64090: Set connection var ansible_timeout to 10 11762 1726853290.64097: Set connection var ansible_shell_type to sh 11762 1726853290.64144: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853290.64147: Set connection var ansible_shell_executable to /bin/sh 11762 1726853290.64150: Set connection var ansible_pipelining to False 11762 1726853290.64152: Set connection var ansible_connection to ssh 11762 1726853290.64174: variable 'ansible_shell_executable' from source: unknown 11762 1726853290.64184: variable 'ansible_connection' from source: unknown 11762 1726853290.64191: variable 'ansible_module_compression' from source: unknown 11762 1726853290.64198: variable 'ansible_shell_type' from source: unknown 11762 1726853290.64205: variable 'ansible_shell_executable' from source: unknown 11762 1726853290.64253: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853290.64256: variable 'ansible_pipelining' from source: unknown 11762 1726853290.64259: variable 'ansible_timeout' from source: unknown 11762 1726853290.64261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853290.64400: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853290.64421: variable 'omit' from source: magic vars 11762 1726853290.64432: starting attempt loop 11762 1726853290.64440: running the handler 11762 1726853290.64461: _low_level_execute_command(): starting 11762 1726853290.64578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853290.65240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 
1726853290.65291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853290.65306: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853290.65359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.65404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.65434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.65481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.65566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.67338: stdout chunk (state=3): >>>/root <<< 11762 1726853290.67504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.67507: stdout chunk (state=3): >>><<< 11762 1726853290.67514: stderr chunk (state=3): >>><<< 11762 1726853290.67537: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853290.67645: _low_level_execute_command(): starting 11762 1726853290.67652: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780 `" && echo ansible-tmp-1726853290.6754951-13763-205741766084780="` echo /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780 `" ) && sleep 0' 11762 1726853290.68222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.68289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.68360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.68381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.68401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.68512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.70573: stdout chunk (state=3): >>>ansible-tmp-1726853290.6754951-13763-205741766084780=/root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780 <<< 11762 1726853290.70725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.70738: stderr chunk (state=3): >>><<< 11762 1726853290.70747: stdout chunk (state=3): >>><<< 11762 1726853290.70983: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853290.6754951-13763-205741766084780=/root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853290.70986: variable 'ansible_module_compression' from source: unknown 11762 1726853290.70989: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853290.70992: variable 'ansible_facts' from source: unknown 11762 1726853290.71006: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/AnsiballZ_command.py 11762 1726853290.71216: Sending initial data 11762 1726853290.71226: Sent initial data (156 bytes) 11762 1726853290.71814: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.71829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.71843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.71866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.71884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853290.71894: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853290.71977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.72003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.72018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.72035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.72133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.73923: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853290.73965: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853290.74023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853290.74117: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp1u5_bjgy /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/AnsiballZ_command.py <<< 11762 1726853290.74126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/AnsiballZ_command.py" <<< 11762 1726853290.74188: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp1u5_bjgy" to remote "/root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/AnsiballZ_command.py" <<< 11762 1726853290.75335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.75339: stdout chunk (state=3): >>><<< 11762 1726853290.75341: stderr chunk (state=3): >>><<< 11762 1726853290.75347: done transferring module to remote 11762 1726853290.75349: _low_level_execute_command(): starting 11762 1726853290.75352: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/ /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/AnsiballZ_command.py && sleep 0' 11762 1726853290.75904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.75920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.75937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.75958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.75978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 
11762 1726853290.75992: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853290.76006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.76025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853290.76039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853290.76051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853290.76063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.76088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.76110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.76191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.76208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.76232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.76345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.78294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853290.78318: stderr chunk (state=3): >>><<< 11762 1726853290.78334: stdout chunk (state=3): >>><<< 11762 1726853290.78363: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853290.78378: _low_level_execute_command(): starting 11762 1726853290.78389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/AnsiballZ_command.py && sleep 0' 11762 1726853290.79023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.79038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.79052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.79070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.79090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853290.79138: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853290.79201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853290.79219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.79246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.79369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853290.96048: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:28:10.950322", "end": "2024-09-20 13:28:10.959282", "delta": "0:00:00.008960", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853290.97664: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853290.97679: stdout chunk (state=3): >>><<< 11762 1726853290.97755: stderr chunk (state=3): >>><<< 11762 1726853290.97758: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:28:10.950322", "end": "2024-09-20 13:28:10.959282", "delta": "0:00:00.008960", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
11762 1726853290.97761: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853290.97784: _low_level_execute_command(): starting 11762 1726853290.97794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853290.6754951-13763-205741766084780/ > /dev/null 2>&1 && sleep 0' 11762 1726853290.98444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853290.98460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853290.98484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853290.98500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.98860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853290.98864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853290.98894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853290.99005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.00918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.00981: stderr chunk (state=3): >>><<< 11762 1726853291.00985: stdout chunk (state=3): >>><<< 11762 1726853291.01176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.01179: handler run complete 11762 1726853291.01181: Evaluated conditional (False): False 11762 1726853291.01183: Evaluated conditional (False): False 11762 1726853291.01188: attempt loop complete, returning result 11762 1726853291.01193: _execute() done 11762 1726853291.01195: dumping result to json 11762 1726853291.01197: done dumping result, returning 11762 1726853291.01199: done running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' [02083763-bbaf-d845-03d0-0000000006d8] 11762 1726853291.01200: sending task result for task 02083763-bbaf-d845-03d0-0000000006d8 11762 1726853291.01266: done sending task result for task 02083763-bbaf-d845-03d0-0000000006d8 11762 1726853291.01268: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "nm-bond"
    ],
    "delta": "0:00:00.008960",
    "end": "2024-09-20 13:28:10.959282",
    "failed_when_result": false,
    "rc": 1,
    "start": "2024-09-20 13:28:10.950322"
}

STDERR:

Cannot find device "nm-bond"

MSG:

non-zero return code
11762 1726853291.01366: no more pending results, returning what we have 11762 1726853291.01370: results queue empty 11762 1726853291.01477: checking for any_errors_fatal 11762 1726853291.01480: done checking for any_errors_fatal 11762 1726853291.01481: checking for max_fail_percentage 11762 1726853291.01483: done checking for max_fail_percentage 11762 1726853291.01484: checking to see if all hosts have failed and the running result is not ok 11762 1726853291.01485: done checking to see if all hosts have failed 11762 1726853291.01486: getting the remaining hosts for this loop 11762 1726853291.01487: done getting the remaining hosts for this loop 11762 1726853291.01491: getting the next task for host managed_node2 11762 1726853291.01500: done getting next task for host managed_node2 11762 1726853291.01502: ^ task is: TASK: Remove test interfaces 11762 
1726853291.01505: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853291.01509: getting variables 11762 1726853291.01510: in VariableManager get_vars() 11762 1726853291.01547: Calling all_inventory to load vars for managed_node2 11762 1726853291.01549: Calling groups_inventory to load vars for managed_node2 11762 1726853291.01551: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853291.01561: Calling all_plugins_play to load vars for managed_node2 11762 1726853291.01563: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853291.01566: Calling groups_plugins_play to load vars for managed_node2 11762 1726853291.04082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853291.06933: done with get_vars() 11762 1726853291.07131: done getting variables 11762 1726853291.07207: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True)

TASK [Remove test interfaces] **************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Friday 20 September 2024  13:28:11 -0400 (0:00:00.454)       0:00:41.505 ******
11762 1726853291.07484: entering _queue_task() for managed_node2/shell 11762 1726853291.07906: worker is 1 (out of 1 available) 11762 1726853291.07922: exiting _queue_task() for managed_node2/shell 11762 1726853291.08057: done queuing things up, now waiting for results queue to drain 11762 1726853291.08059: waiting for pending results... 11762 1726853291.08395: running TaskExecutor() for managed_node2/TASK: Remove test interfaces 11762 1726853291.08515: in run() - task 02083763-bbaf-d845-03d0-0000000006de 11762 1726853291.08532: variable 'ansible_search_path' from source: unknown 11762 1726853291.08536: variable 'ansible_search_path' from source: unknown 11762 1726853291.08582: calling self._execute() 11762 1726853291.08685: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853291.08692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853291.08703: variable 'omit' from source: magic vars 11762 1726853291.09096: variable 'ansible_distribution_major_version' from source: facts 11762 1726853291.09115: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853291.09120: variable 'omit' from source: magic vars 11762 1726853291.09179: variable 'omit' from source: magic vars 11762 1726853291.09330: variable 'dhcp_interface1' from source: play vars 11762 1726853291.09335: variable 'dhcp_interface2' from source: play vars 11762 1726853291.09378: variable 'omit' from source: magic vars 11762 1726853291.09395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853291.09433: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853291.09463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853291.09478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853291.09487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853291.09576: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853291.09580: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853291.09582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853291.09616: Set connection var ansible_timeout to 10 11762 1726853291.09619: Set connection var ansible_shell_type to sh 11762 1726853291.09624: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853291.09630: Set connection var ansible_shell_executable to /bin/sh 11762 1726853291.09637: Set connection var ansible_pipelining to False 11762 1726853291.09651: Set connection var ansible_connection to ssh 11762 1726853291.09673: variable 'ansible_shell_executable' from source: unknown 11762 1726853291.09676: variable 'ansible_connection' from source: unknown 11762 1726853291.09678: variable 'ansible_module_compression' from source: unknown 11762 1726853291.09681: variable 'ansible_shell_type' from source: unknown 11762 1726853291.09683: variable 'ansible_shell_executable' from source: unknown 11762 1726853291.09685: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853291.09696: variable 'ansible_pipelining' from source: unknown 11762 1726853291.09776: variable 'ansible_timeout' from source: unknown 11762 1726853291.09780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 
1726853291.09830: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853291.09847: variable 'omit' from source: magic vars 11762 1726853291.09852: starting attempt loop 11762 1726853291.09855: running the handler 11762 1726853291.09873: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853291.09890: _low_level_execute_command(): starting 11762 1726853291.09897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853291.10635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853291.10687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853291.10740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.10794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.10797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.10902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.12625: stdout chunk (state=3): >>>/root <<< 11762 1726853291.12781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.12785: stdout chunk (state=3): >>><<< 11762 1726853291.12788: stderr chunk (state=3): >>><<< 11762 1726853291.12920: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.12924: 
_low_level_execute_command(): starting 11762 1726853291.12928: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660 `" && echo ansible-tmp-1726853291.1281803-13789-159928975575660="` echo /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660 `" ) && sleep 0' 11762 1726853291.13489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.13513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.13539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853291.13630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.13661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.13680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.13707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.13815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 
1726853291.16038: stdout chunk (state=3): >>>ansible-tmp-1726853291.1281803-13789-159928975575660=/root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660 <<< 11762 1726853291.16198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.16202: stdout chunk (state=3): >>><<< 11762 1726853291.16204: stderr chunk (state=3): >>><<< 11762 1726853291.16224: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853291.1281803-13789-159928975575660=/root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.16376: variable 'ansible_module_compression' from source: unknown 11762 1726853291.16379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 
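The tmpdir command above is worth unpacking: the subshell sets `umask 77` so everything is created mode 0700 (private to the connecting user), `mkdir -p` creates the base path, a second plain `mkdir` fails if the uniquely named per-run directory already exists, and the final `echo name=path` hands the resolved path back to the controller (that is the `ansible-tmp-…=…` line in stdout). A minimal local sketch of the same pattern, with an illustrative path rather than the recorded one:

```shell
# Private per-run tmpdir, mirroring the recorded command.
base="${TMPDIR:-/tmp}/ansible-demo-$$"
run="$base/run-$(date +%s)"
( umask 77 && mkdir -p "$base" && mkdir "$run" && echo "tmpdir=$run" )
stat -c '%a' "$run"   # mode of the run dir; prints 700 with GNU stat
rm -rf "$base"
```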
1726853291.16382: variable 'ansible_facts' from source: unknown 11762 1726853291.16450: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/AnsiballZ_command.py 11762 1726853291.16623: Sending initial data 11762 1726853291.16633: Sent initial data (156 bytes) 11762 1726853291.17237: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.17253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.17285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.17304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853291.17394: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.17414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.17434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.17459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.17568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.19315: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853291.19384: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853291.19449: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpahiicxa4 /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/AnsiballZ_command.py <<< 11762 1726853291.19463: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/AnsiballZ_command.py" <<< 11762 1726853291.19526: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpahiicxa4" to remote "/root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/AnsiballZ_command.py" <<< 11762 1726853291.20431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.20495: stderr chunk (state=3): >>><<< 11762 1726853291.20614: stdout chunk (state=3): >>><<< 11762 1726853291.20617: done transferring module to remote 11762 1726853291.20621: _low_level_execute_command(): starting 
11762 1726853291.20623: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/ /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/AnsiballZ_command.py && sleep 0' 11762 1726853291.21170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.21189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.21203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853291.21221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853291.21238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853291.21260: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853291.21369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.21375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.21397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.21413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.21521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 
1726853291.23842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.23847: stdout chunk (state=3): >>><<< 11762 1726853291.23849: stderr chunk (state=3): >>><<< 11762 1726853291.23852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.23854: _low_level_execute_command(): starting 11762 1726853291.23856: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/AnsiballZ_command.py && sleep 0' 11762 1726853291.24379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.24394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.24407: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 11762 1726853291.24423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853291.24437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853291.24448: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853291.24462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.24491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853291.24579: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.24785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.25041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.45127: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip 
link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:28:11.407181", "end": "2024-09-20 13:28:11.450068", "delta": "0:00:00.042887", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853291.46786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.46800: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
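The `_raw_params` payload in this result is hard to read with its escaped newlines. Unescaped, the script the command module executed is reproduced below; note how each exit status is captured into `rc` via `||` so that `set -e` does not abort early and every interface gets a deletion attempt. Actually running it requires root and the `test1`/`test2`/`testbr` links, so treat this as a transcript of the recorded task, not a portable sample:

```shell
set -euxo pipefail
exec 1>&2
rc=0
ip link delete test1 || rc="$?"
if [ "$rc" != 0 ]; then
  echo ERROR - could not delete link test1 - error "$rc"
fi
ip link delete test2 || rc="$?"
if [ "$rc" != 0 ]; then
  echo ERROR - could not delete link test2 - error "$rc"
fi
ip link delete testbr || rc="$?"
if [ "$rc" != 0 ]; then
  echo ERROR - could not delete link testbr - error "$rc"
fi
```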
<<< 11762 1726853291.46852: stderr chunk (state=3): >>><<< 11762 1726853291.46861: stdout chunk (state=3): >>><<< 11762 1726853291.46884: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:28:11.407181", "end": "2024-09-20 13:28:11.450068", "delta": "0:00:00.042887", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853291.46932: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/', '_ansible_remote_tmp': '~/.ansible/tmp', 
'_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853291.46939: _low_level_execute_command(): starting 11762 1726853291.46946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853291.1281803-13789-159928975575660/ > /dev/null 2>&1 && sleep 0' 11762 1726853291.47593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.47597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.47602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853291.47617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853291.47630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853291.47637: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853291.47650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.47664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853291.47675: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853291.47682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853291.47769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.47779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.47809: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.47813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.47928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.49876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.49880: stderr chunk (state=3): >>><<< 11762 1726853291.49883: stdout chunk (state=3): >>><<< 11762 1726853291.49904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.49911: handler run complete 11762 1726853291.49937: Evaluated conditional (False): False 11762 1726853291.49949: attempt loop complete, returning result 11762 1726853291.49952: _execute() done 11762 1726853291.49954: dumping result to json 11762 
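Taken together, this task shows the full per-task module lifecycle that ansible-core drives over SSH: create a private tmpdir, transfer `AnsiballZ_command.py` via sftp, `chmod u+x` the directory and wrapper, execute the wrapper with the remote Python, then `rm -rf` the tmpdir. A condensed local replay of those five steps (the path and the stand-in payload are illustrative; the real run wraps each step in a multiplexed ssh session):

```shell
tmp="${TMPDIR:-/tmp}/ansible-lifecycle-demo-$$"
( umask 77 && mkdir -p "$tmp" )                             # 1. private tmpdir
printf 'print("module ran")\n' > "$tmp/AnsiballZ_demo.py"   # 2. stand-in for the sftp put
chmod u+x "$tmp" "$tmp/AnsiballZ_demo.py"                   # 3. chmod, as in the log
python3 "$tmp/AnsiballZ_demo.py"                            # 4. execute with the remote python
rm -rf "$tmp"                                               # 5. cleanup, as in the log
```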
1726853291.49959: done dumping result, returning 11762 1726853291.49967: done running TaskExecutor() for managed_node2/TASK: Remove test interfaces [02083763-bbaf-d845-03d0-0000000006de] 11762 1726853291.49986: sending task result for task 02083763-bbaf-d845-03d0-0000000006de ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.042887", "end": "2024-09-20 13:28:11.450068", "rc": 0, "start": "2024-09-20 13:28:11.407181" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11762 1726853291.50165: no more pending results, returning what we have 11762 1726853291.50168: results queue empty 11762 1726853291.50169: checking for any_errors_fatal 11762 1726853291.50187: done checking for any_errors_fatal 11762 1726853291.50188: checking for max_fail_percentage 11762 1726853291.50190: done checking for max_fail_percentage 11762 1726853291.50191: checking to see if all hosts have failed and the running result is not ok 11762 1726853291.50192: done checking to see if all hosts have failed 11762 1726853291.50193: getting the remaining hosts for this loop 11762 1726853291.50195: done getting the remaining hosts for this loop 11762 1726853291.50199: getting the next task for host managed_node2 11762 1726853291.50206: done getting next task for host managed_node2 11762 1726853291.50209: ^ task is: TASK: Stop dnsmasq/radvd services 11762 1726853291.50213: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853291.50218: getting variables 11762 1726853291.50220: in VariableManager get_vars() 11762 1726853291.50265: Calling all_inventory to load vars for managed_node2 11762 1726853291.50268: Calling groups_inventory to load vars for managed_node2 11762 1726853291.50480: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853291.50497: Calling all_plugins_play to load vars for managed_node2 11762 1726853291.50501: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853291.50505: Calling groups_plugins_play to load vars for managed_node2 11762 1726853291.51262: done sending task result for task 02083763-bbaf-d845-03d0-0000000006de 11762 1726853291.51268: WORKER PROCESS EXITING 11762 1726853291.52257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853291.53915: done with get_vars() 11762 1726853291.53957: done getting variables 11762 1726853291.54024: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 13:28:11 -0400 (0:00:00.465) 0:00:41.971 ****** 11762 1726853291.54061: entering _queue_task() for managed_node2/shell 11762 1726853291.54600: worker is 1 (out of 1 available) 11762 1726853291.54611: exiting _queue_task() for managed_node2/shell 11762 1726853291.54623: done queuing things up, now waiting for results queue to drain 11762 1726853291.54624: waiting for pending results... 11762 1726853291.54773: running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services 11762 1726853291.54869: in run() - task 02083763-bbaf-d845-03d0-0000000006df 11762 1726853291.54887: variable 'ansible_search_path' from source: unknown 11762 1726853291.54891: variable 'ansible_search_path' from source: unknown 11762 1726853291.54921: calling self._execute() 11762 1726853291.55031: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853291.55076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853291.55080: variable 'omit' from source: magic vars 11762 1726853291.55456: variable 'ansible_distribution_major_version' from source: facts 11762 1726853291.55468: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853291.55476: variable 'omit' from source: magic vars 11762 1726853291.55535: variable 'omit' from source: magic vars 11762 1726853291.55566: variable 'omit' from source: magic vars 11762 1726853291.55625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853291.55674: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853291.55677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853291.55693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853291.55769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853291.55774: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853291.55776: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853291.55779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853291.55864: Set connection var ansible_timeout to 10 11762 1726853291.55867: Set connection var ansible_shell_type to sh 11762 1726853291.55876: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853291.55879: Set connection var ansible_shell_executable to /bin/sh 11762 1726853291.55887: Set connection var ansible_pipelining to False 11762 1726853291.55894: Set connection var ansible_connection to ssh 11762 1726853291.55918: variable 'ansible_shell_executable' from source: unknown 11762 1726853291.55921: variable 'ansible_connection' from source: unknown 11762 1726853291.55924: variable 'ansible_module_compression' from source: unknown 11762 1726853291.55926: variable 'ansible_shell_type' from source: unknown 11762 1726853291.55928: variable 'ansible_shell_executable' from source: unknown 11762 1726853291.55930: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853291.55945: variable 'ansible_pipelining' from source: unknown 11762 1726853291.55948: variable 'ansible_timeout' from source: unknown 11762 1726853291.55986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 
1726853291.56111: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853291.56128: variable 'omit' from source: magic vars 11762 1726853291.56131: starting attempt loop 11762 1726853291.56133: running the handler 11762 1726853291.56143: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853291.56173: _low_level_execute_command(): starting 11762 1726853291.56184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853291.56976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.57033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853291.57042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.57057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.57079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.57185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.58956: stdout chunk (state=3): >>>/root <<< 11762 1726853291.59096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.59118: stderr chunk (state=3): >>><<< 11762 1726853291.59133: stdout chunk (state=3): >>><<< 11762 1726853291.59246: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.59250: 
_low_level_execute_command(): starting 11762 1726853291.59252: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724 `" && echo ansible-tmp-1726853291.5915823-13817-30173486504724="` echo /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724 `" ) && sleep 0' 11762 1726853291.59782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.59803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.59824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853291.59855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853291.59929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.59976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.60002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.60037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.60142: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11762 1726853291.62164: stdout chunk (state=3): >>>ansible-tmp-1726853291.5915823-13817-30173486504724=/root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724 <<< 11762 1726853291.62332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.62336: stdout chunk (state=3): >>><<< 11762 1726853291.62338: stderr chunk (state=3): >>><<< 11762 1726853291.62478: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853291.5915823-13817-30173486504724=/root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.62482: variable 'ansible_module_compression' from source: unknown 11762 1726853291.62484: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853291.62502: variable 'ansible_facts' from source: unknown 11762 1726853291.62591: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/AnsiballZ_command.py 11762 1726853291.62845: Sending initial data 11762 1726853291.62848: Sent initial data (155 bytes) 11762 1726853291.63426: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.63437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.63450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853291.63474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853291.63492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853291.63592: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.63611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.63716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 
1726853291.65420: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853291.65463: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853291.65546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853291.65634: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpapvr5h4_ /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/AnsiballZ_command.py <<< 11762 1726853291.65637: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/AnsiballZ_command.py" <<< 11762 1726853291.65702: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpapvr5h4_" to remote "/root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/AnsiballZ_command.py" <<< 11762 1726853291.66686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.66868: stderr chunk (state=3): >>><<< 11762 1726853291.66877: stdout chunk (state=3): >>><<< 11762 
1726853291.66880: done transferring module to remote 11762 1726853291.66883: _low_level_execute_command(): starting 11762 1726853291.66886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/ /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/AnsiballZ_command.py && sleep 0' 11762 1726853291.67677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853291.67730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.67751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.67782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.67904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.69841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.69855: stdout chunk (state=3): >>><<< 11762 1726853291.69867: stderr chunk (state=3): >>><<< 11762 
1726853291.69888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.69895: _low_level_execute_command(): starting 11762 1726853291.69904: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/AnsiballZ_command.py && sleep 0' 11762 1726853291.70580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.70969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.70991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853291.71005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.71113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.89697: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:28:11.867831", "end": "2024-09-20 13:28:11.895299", "delta": "0:00:00.027468", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853291.91378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853291.91393: stdout chunk (state=3): >>><<< 11762 1726853291.91411: stderr chunk (state=3): >>><<< 11762 1726853291.91457: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:28:11.867831", "end": "2024-09-20 13:28:11.895299", "delta": "0:00:00.027468", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n 
# Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853291.91517: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853291.91553: _low_level_execute_command(): starting 11762 1726853291.91563: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853291.5915823-13817-30173486504724/ > /dev/null 2>&1 && sleep 0' 11762 1726853291.92568: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853291.92589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853291.92606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853291.92626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 
1726853291.92866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853291.92893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853291.92994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853291.94988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853291.95000: stdout chunk (state=3): >>><<< 11762 1726853291.95290: stderr chunk (state=3): >>><<< 11762 1726853291.95294: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853291.95297: handler run complete 11762 1726853291.95299: Evaluated conditional (False): False 11762 1726853291.95301: attempt loop complete, returning result 11762 1726853291.95304: _execute() done 11762 1726853291.95305: dumping result to json 11762 1726853291.95308: done dumping result, returning 11762 1726853291.95310: done running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services [02083763-bbaf-d845-03d0-0000000006df] 11762 1726853291.95312: sending task result for task 02083763-bbaf-d845-03d0-0000000006df 11762 1726853291.95389: done sending task result for task 02083763-bbaf-d845-03d0-0000000006df ok: [managed_node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.027468", "end": "2024-09-20 13:28:11.895299", "rc": 0, "start": "2024-09-20 13:28:11.867831" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active 
firewalld inactive 11762 1726853291.95462: no more pending results, returning what we have 11762 1726853291.95466: results queue empty 11762 1726853291.95466: checking for any_errors_fatal 11762 1726853291.95491: done checking for any_errors_fatal 11762 1726853291.95492: checking for max_fail_percentage 11762 1726853291.95495: done checking for max_fail_percentage 11762 1726853291.95496: checking to see if all hosts have failed and the running result is not ok 11762 1726853291.95496: done checking to see if all hosts have failed 11762 1726853291.95497: getting the remaining hosts for this loop 11762 1726853291.95499: done getting the remaining hosts for this loop 11762 1726853291.95502: getting the next task for host managed_node2 11762 1726853291.95515: done getting next task for host managed_node2 11762 1726853291.95517: ^ task is: TASK: Reset bond options to assert 11762 1726853291.95520: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853291.95524: getting variables 11762 1726853291.95526: in VariableManager get_vars() 11762 1726853291.95569: Calling all_inventory to load vars for managed_node2 11762 1726853291.95964: Calling groups_inventory to load vars for managed_node2 11762 1726853291.95968: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853291.95982: Calling all_plugins_play to load vars for managed_node2 11762 1726853291.95985: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853291.95990: Calling groups_plugins_play to load vars for managed_node2 11762 1726853291.96562: WORKER PROCESS EXITING 11762 1726853292.02601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853292.04131: done with get_vars() 11762 1726853292.04165: done getting variables 11762 1726853292.04231: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reset bond options to assert] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:59 Friday 20 September 2024 13:28:12 -0400 (0:00:00.501) 0:00:42.473 ****** 11762 1726853292.04257: entering _queue_task() for managed_node2/set_fact 11762 1726853292.04624: worker is 1 (out of 1 available) 11762 1726853292.04637: exiting _queue_task() for managed_node2/set_fact 11762 1726853292.04651: done queuing things up, now waiting for results queue to drain 11762 1726853292.04653: waiting for pending results... 
11762 1726853292.05001: running TaskExecutor() for managed_node2/TASK: Reset bond options to assert 11762 1726853292.05096: in run() - task 02083763-bbaf-d845-03d0-00000000000f 11762 1726853292.05100: variable 'ansible_search_path' from source: unknown 11762 1726853292.05204: calling self._execute() 11762 1726853292.05252: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.05264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.05279: variable 'omit' from source: magic vars 11762 1726853292.05691: variable 'ansible_distribution_major_version' from source: facts 11762 1726853292.05709: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853292.05721: variable 'omit' from source: magic vars 11762 1726853292.05758: variable 'omit' from source: magic vars 11762 1726853292.05803: variable 'dhcp_interface1' from source: play vars 11762 1726853292.05879: variable 'dhcp_interface1' from source: play vars 11762 1726853292.05902: variable 'omit' from source: magic vars 11762 1726853292.05966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853292.05997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853292.06023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853292.06076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.06080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.06103: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853292.06114: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.06121: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.06229: Set connection var ansible_timeout to 10 11762 1726853292.06294: Set connection var ansible_shell_type to sh 11762 1726853292.06297: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853292.06299: Set connection var ansible_shell_executable to /bin/sh 11762 1726853292.06302: Set connection var ansible_pipelining to False 11762 1726853292.06306: Set connection var ansible_connection to ssh 11762 1726853292.06308: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.06311: variable 'ansible_connection' from source: unknown 11762 1726853292.06316: variable 'ansible_module_compression' from source: unknown 11762 1726853292.06324: variable 'ansible_shell_type' from source: unknown 11762 1726853292.06331: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.06402: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.06406: variable 'ansible_pipelining' from source: unknown 11762 1726853292.06408: variable 'ansible_timeout' from source: unknown 11762 1726853292.06410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.06502: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853292.06525: variable 'omit' from source: magic vars 11762 1726853292.06544: starting attempt loop 11762 1726853292.06621: running the handler 11762 1726853292.06625: handler run complete 11762 1726853292.06627: attempt loop complete, returning result 11762 1726853292.06629: _execute() done 11762 1726853292.06632: dumping result to json 11762 1726853292.06634: done dumping result, returning 11762 
1726853292.06636: done running TaskExecutor() for managed_node2/TASK: Reset bond options to assert [02083763-bbaf-d845-03d0-00000000000f] 11762 1726853292.06639: sending task result for task 02083763-bbaf-d845-03d0-00000000000f ok: [managed_node2] => { "ansible_facts": { "bond_options_to_assert": [ { "key": "mode", "value": "active-backup" }, { "key": "arp_interval", "value": "60" }, { "key": "arp_ip_target", "value": "192.0.2.128" }, { "key": "arp_validate", "value": "none" }, { "key": "primary", "value": "test1" } ] }, "changed": false } 11762 1726853292.06849: no more pending results, returning what we have 11762 1726853292.06854: results queue empty 11762 1726853292.06855: checking for any_errors_fatal 11762 1726853292.06866: done checking for any_errors_fatal 11762 1726853292.06867: checking for max_fail_percentage 11762 1726853292.06869: done checking for max_fail_percentage 11762 1726853292.06870: checking to see if all hosts have failed and the running result is not ok 11762 1726853292.06872: done checking to see if all hosts have failed 11762 1726853292.06873: getting the remaining hosts for this loop 11762 1726853292.06876: done getting the remaining hosts for this loop 11762 1726853292.06880: getting the next task for host managed_node2 11762 1726853292.06888: done getting next task for host managed_node2 11762 1726853292.06891: ^ task is: TASK: Include the task 'run_test.yml' 11762 1726853292.06893: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853292.06897: getting variables 11762 1726853292.06898: in VariableManager get_vars() 11762 1726853292.06939: Calling all_inventory to load vars for managed_node2 11762 1726853292.06942: Calling groups_inventory to load vars for managed_node2 11762 1726853292.06944: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853292.06955: Calling all_plugins_play to load vars for managed_node2 11762 1726853292.06958: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853292.06960: Calling groups_plugins_play to load vars for managed_node2 11762 1726853292.07688: done sending task result for task 02083763-bbaf-d845-03d0-00000000000f 11762 1726853292.07692: WORKER PROCESS EXITING 11762 1726853292.08759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853292.11516: done with get_vars() 11762 1726853292.11592: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:72 Friday 20 September 2024 13:28:12 -0400 (0:00:00.075) 0:00:42.548 ****** 11762 1726853292.11880: entering _queue_task() for managed_node2/include_tasks 11762 1726853292.12331: worker is 1 (out of 1 available) 11762 1726853292.12490: exiting _queue_task() for managed_node2/include_tasks 11762 1726853292.12503: done queuing things up, now waiting for results queue to drain 11762 1726853292.12505: waiting for pending results... 
11762 1726853292.12897: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 11762 1726853292.13200: in run() - task 02083763-bbaf-d845-03d0-000000000011 11762 1726853292.13222: variable 'ansible_search_path' from source: unknown 11762 1726853292.13255: calling self._execute() 11762 1726853292.13479: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.13482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.13486: variable 'omit' from source: magic vars 11762 1726853292.14442: variable 'ansible_distribution_major_version' from source: facts 11762 1726853292.14447: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853292.14450: _execute() done 11762 1726853292.14453: dumping result to json 11762 1726853292.14455: done dumping result, returning 11762 1726853292.14458: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-d845-03d0-000000000011] 11762 1726853292.14461: sending task result for task 02083763-bbaf-d845-03d0-000000000011 11762 1726853292.14695: no more pending results, returning what we have 11762 1726853292.14701: in VariableManager get_vars() 11762 1726853292.14752: Calling all_inventory to load vars for managed_node2 11762 1726853292.14755: Calling groups_inventory to load vars for managed_node2 11762 1726853292.14758: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853292.14768: done sending task result for task 02083763-bbaf-d845-03d0-000000000011 11762 1726853292.14773: WORKER PROCESS EXITING 11762 1726853292.14881: Calling all_plugins_play to load vars for managed_node2 11762 1726853292.14885: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853292.14887: Calling groups_plugins_play to load vars for managed_node2 11762 1726853292.19461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 11762 1726853292.23808: done with get_vars() 11762 1726853292.23836: variable 'ansible_search_path' from source: unknown 11762 1726853292.23857: we have included files to process 11762 1726853292.23858: generating all_blocks data 11762 1726853292.23863: done generating all_blocks data 11762 1726853292.23868: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11762 1726853292.23869: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11762 1726853292.23875: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11762 1726853292.24719: in VariableManager get_vars() 11762 1726853292.24745: done with get_vars() 11762 1726853292.24991: in VariableManager get_vars() 11762 1726853292.25011: done with get_vars() 11762 1726853292.25054: in VariableManager get_vars() 11762 1726853292.25076: done with get_vars() 11762 1726853292.25117: in VariableManager get_vars() 11762 1726853292.25137: done with get_vars() 11762 1726853292.25384: in VariableManager get_vars() 11762 1726853292.25404: done with get_vars() 11762 1726853292.26204: in VariableManager get_vars() 11762 1726853292.26223: done with get_vars() 11762 1726853292.26235: done processing included file 11762 1726853292.26237: iterating over new_blocks loaded from include file 11762 1726853292.26238: in VariableManager get_vars() 11762 1726853292.26253: done with get_vars() 11762 1726853292.26255: filtering new block on tags 11762 1726853292.26345: done filtering new block on tags 11762 1726853292.26348: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 11762 1726853292.26353: extending task lists for all hosts with included 
blocks 11762 1726853292.26594: done extending task lists 11762 1726853292.26596: done processing included files 11762 1726853292.26597: results queue empty 11762 1726853292.26597: checking for any_errors_fatal 11762 1726853292.26601: done checking for any_errors_fatal 11762 1726853292.26601: checking for max_fail_percentage 11762 1726853292.26603: done checking for max_fail_percentage 11762 1726853292.26603: checking to see if all hosts have failed and the running result is not ok 11762 1726853292.26604: done checking to see if all hosts have failed 11762 1726853292.26605: getting the remaining hosts for this loop 11762 1726853292.26606: done getting the remaining hosts for this loop 11762 1726853292.26608: getting the next task for host managed_node2 11762 1726853292.26612: done getting next task for host managed_node2 11762 1726853292.26614: ^ task is: TASK: TEST: {{ lsr_description }} 11762 1726853292.26616: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853292.26619: getting variables 11762 1726853292.26620: in VariableManager get_vars() 11762 1726853292.26631: Calling all_inventory to load vars for managed_node2 11762 1726853292.26633: Calling groups_inventory to load vars for managed_node2 11762 1726853292.26635: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853292.26641: Calling all_plugins_play to load vars for managed_node2 11762 1726853292.26646: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853292.26649: Calling groups_plugins_play to load vars for managed_node2 11762 1726853292.29160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853292.32690: done with get_vars() 11762 1726853292.32724: done getting variables 11762 1726853292.32777: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853292.33096: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:28:12 -0400 (0:00:00.213) 0:00:42.761 ****** 11762 1726853292.33126: entering _queue_task() for managed_node2/debug 11762 1726853292.33921: worker is 1 (out of 1 available) 11762 1726853292.33934: exiting _queue_task() for managed_node2/debug 11762 1726853292.33948: done queuing things up, now waiting for results queue to drain 11762 1726853292.33950: waiting for pending results... 
11762 1726853292.34595: running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 11762 1726853292.34603: in run() - task 02083763-bbaf-d845-03d0-0000000008ea 11762 1726853292.34608: variable 'ansible_search_path' from source: unknown 11762 1726853292.34611: variable 'ansible_search_path' from source: unknown 11762 1726853292.34649: calling self._execute() 11762 1726853292.34902: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.34911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.34999: variable 'omit' from source: magic vars 11762 1726853292.35756: variable 'ansible_distribution_major_version' from source: facts 11762 1726853292.35769: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853292.35908: variable 'omit' from source: magic vars 11762 1726853292.35948: variable 'omit' from source: magic vars 11762 1726853292.36049: variable 'lsr_description' from source: include params 11762 1726853292.36068: variable 'omit' from source: magic vars 11762 1726853292.37017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853292.37055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853292.37079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853292.37121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.37124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.37135: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853292.37138: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.37145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.37776: Set connection var ansible_timeout to 10 11762 1726853292.37779: Set connection var ansible_shell_type to sh 11762 1726853292.37782: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853292.37784: Set connection var ansible_shell_executable to /bin/sh 11762 1726853292.37786: Set connection var ansible_pipelining to False 11762 1726853292.37788: Set connection var ansible_connection to ssh 11762 1726853292.37804: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.37808: variable 'ansible_connection' from source: unknown 11762 1726853292.37810: variable 'ansible_module_compression' from source: unknown 11762 1726853292.37813: variable 'ansible_shell_type' from source: unknown 11762 1726853292.37815: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.37818: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.37821: variable 'ansible_pipelining' from source: unknown 11762 1726853292.37824: variable 'ansible_timeout' from source: unknown 11762 1726853292.37827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.38320: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853292.38324: variable 'omit' from source: magic vars 11762 1726853292.38327: starting attempt loop 11762 1726853292.38329: running the handler 11762 1726853292.38345: handler run complete 11762 1726853292.38428: attempt loop complete, 
returning result 11762 1726853292.38431: _execute() done 11762 1726853292.38434: dumping result to json 11762 1726853292.38436: done dumping result, returning 11762 1726853292.38439: done running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [02083763-bbaf-d845-03d0-0000000008ea] 11762 1726853292.38882: sending task result for task 02083763-bbaf-d845-03d0-0000000008ea 11762 1726853292.39183: done sending task result for task 02083763-bbaf-d845-03d0-0000000008ea 11762 1726853292.39186: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 11762 1726853292.39231: no more pending results, returning what we have 11762 1726853292.39235: results queue empty 11762 1726853292.39235: checking for any_errors_fatal 11762 1726853292.39237: done checking for any_errors_fatal 11762 1726853292.39237: checking for max_fail_percentage 11762 1726853292.39239: done checking for max_fail_percentage 11762 1726853292.39240: checking to see if all hosts have failed and the running result is not ok 11762 1726853292.39241: done checking to see if all hosts have failed 11762 1726853292.39241: getting the remaining hosts for this loop 11762 1726853292.39245: done getting the remaining hosts for this loop 11762 1726853292.39249: getting the next task for host managed_node2 11762 1726853292.39254: done getting next task for host managed_node2 11762 1726853292.39257: ^ task is: TASK: Show item 11762 1726853292.39260: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853292.39264: getting variables 11762 1726853292.39266: in VariableManager get_vars() 11762 1726853292.39310: Calling all_inventory to load vars for managed_node2 11762 1726853292.39314: Calling groups_inventory to load vars for managed_node2 11762 1726853292.39316: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853292.39327: Calling all_plugins_play to load vars for managed_node2 11762 1726853292.39331: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853292.39334: Calling groups_plugins_play to load vars for managed_node2 11762 1726853292.41984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853292.45187: done with get_vars() 11762 1726853292.45215: done getting variables 11762 1726853292.45275: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:28:12 -0400 (0:00:00.121) 0:00:42.883 ****** 11762 1726853292.45309: entering _queue_task() for managed_node2/debug 11762 1726853292.46067: worker is 1 (out of 1 available) 11762 
1726853292.46084: exiting _queue_task() for managed_node2/debug 11762 1726853292.46100: done queuing things up, now waiting for results queue to drain 11762 1726853292.46101: waiting for pending results... 11762 1726853292.46567: running TaskExecutor() for managed_node2/TASK: Show item 11762 1726853292.46775: in run() - task 02083763-bbaf-d845-03d0-0000000008eb 11762 1726853292.46937: variable 'ansible_search_path' from source: unknown 11762 1726853292.46941: variable 'ansible_search_path' from source: unknown 11762 1726853292.46993: variable 'omit' from source: magic vars 11762 1726853292.47310: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.47323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.47336: variable 'omit' from source: magic vars 11762 1726853292.48556: variable 'ansible_distribution_major_version' from source: facts 11762 1726853292.48568: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853292.48576: variable 'omit' from source: magic vars 11762 1726853292.48614: variable 'omit' from source: magic vars 11762 1726853292.48656: variable 'item' from source: unknown 11762 1726853292.48723: variable 'item' from source: unknown 11762 1726853292.48741: variable 'omit' from source: magic vars 11762 1726853292.49181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853292.49217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853292.49276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853292.49279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.49282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11762 1726853292.49302: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853292.49308: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.49313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.49751: Set connection var ansible_timeout to 10 11762 1726853292.49754: Set connection var ansible_shell_type to sh 11762 1726853292.49757: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853292.49759: Set connection var ansible_shell_executable to /bin/sh 11762 1726853292.49761: Set connection var ansible_pipelining to False 11762 1726853292.49763: Set connection var ansible_connection to ssh 11762 1726853292.49766: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.49768: variable 'ansible_connection' from source: unknown 11762 1726853292.49772: variable 'ansible_module_compression' from source: unknown 11762 1726853292.49775: variable 'ansible_shell_type' from source: unknown 11762 1726853292.49778: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.49780: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.49782: variable 'ansible_pipelining' from source: unknown 11762 1726853292.49784: variable 'ansible_timeout' from source: unknown 11762 1726853292.49786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.49859: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853292.49863: variable 'omit' from source: magic vars 11762 1726853292.49866: starting attempt loop 11762 1726853292.49868: running the handler 11762 1726853292.49870: variable 
'lsr_description' from source: include params 11762 1726853292.49933: variable 'lsr_description' from source: include params 11762 1726853292.49945: handler run complete 11762 1726853292.49960: attempt loop complete, returning result 11762 1726853292.50176: variable 'item' from source: unknown 11762 1726853292.50181: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 11762 1726853292.50320: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.50324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.50327: variable 'omit' from source: magic vars 11762 1726853292.50775: variable 'ansible_distribution_major_version' from source: facts 11762 1726853292.50778: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853292.50781: variable 'omit' from source: magic vars 11762 1726853292.50783: variable 'omit' from source: magic vars 11762 1726853292.50785: variable 'item' from source: unknown 11762 1726853292.50787: variable 'item' from source: unknown 11762 1726853292.50789: variable 'omit' from source: magic vars 11762 1726853292.50791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853292.50793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.50796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.50798: variable 'inventory_hostname' 
from source: host vars for 'managed_node2' 11762 1726853292.50800: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.50802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.50804: Set connection var ansible_timeout to 10 11762 1726853292.50805: Set connection var ansible_shell_type to sh 11762 1726853292.50807: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853292.50809: Set connection var ansible_shell_executable to /bin/sh 11762 1726853292.50811: Set connection var ansible_pipelining to False 11762 1726853292.50813: Set connection var ansible_connection to ssh 11762 1726853292.50815: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.50817: variable 'ansible_connection' from source: unknown 11762 1726853292.50819: variable 'ansible_module_compression' from source: unknown 11762 1726853292.50821: variable 'ansible_shell_type' from source: unknown 11762 1726853292.50823: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.50825: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.50827: variable 'ansible_pipelining' from source: unknown 11762 1726853292.50829: variable 'ansible_timeout' from source: unknown 11762 1726853292.50831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.50833: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853292.50835: variable 'omit' from source: magic vars 11762 1726853292.50837: starting attempt loop 11762 1726853292.50839: running the handler 11762 1726853292.50841: variable 'lsr_setup' from source: include params 11762 1726853292.50878: variable 
'lsr_setup' from source: include params
11762 1726853292.50923: handler run complete
11762 1726853292.50940: attempt loop complete, returning result
11762 1726853292.50954: variable 'item' from source: unknown
11762 1726853292.51385: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] }
11762 1726853292.51463: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.51466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.51469: variable 'omit' from source: magic vars
11762 1726853292.51474: variable 'ansible_distribution_major_version' from source: facts
11762 1726853292.51477: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853292.51479: variable 'omit' from source: magic vars
11762 1726853292.51482: variable 'omit' from source: magic vars
11762 1726853292.51484: variable 'item' from source: unknown
11762 1726853292.51487: variable 'item' from source: unknown
11762 1726853292.51490: variable 'omit' from source: magic vars
11762 1726853292.51493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853292.51506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.51512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.51522: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853292.51525: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.51527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.51602: Set connection var ansible_timeout to 10
11762 1726853292.51607: Set connection var ansible_shell_type to sh
11762 1726853292.51876: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853292.51879: Set connection var ansible_shell_executable to /bin/sh
11762 1726853292.51882: Set connection var ansible_pipelining to False
11762 1726853292.51884: Set connection var ansible_connection to ssh
11762 1726853292.51886: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.51888: variable 'ansible_connection' from source: unknown
11762 1726853292.51890: variable 'ansible_module_compression' from source: unknown
11762 1726853292.51892: variable 'ansible_shell_type' from source: unknown
11762 1726853292.51894: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.51896: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.51898: variable 'ansible_pipelining' from source: unknown
11762 1726853292.51900: variable 'ansible_timeout' from source: unknown
11762 1726853292.51902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.51904: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853292.51906: variable 'omit' from source: magic vars
11762 1726853292.51908: starting attempt loop
11762 1726853292.51910: running the handler
11762 1726853292.51912: variable 'lsr_test' from source: include params
11762 1726853292.51914: variable 'lsr_test' from source: include params
11762 1726853292.51916: handler run complete
11762 1726853292.51917: attempt loop complete, returning result
11762 1726853292.51919: variable 'item' from source: unknown
11762 1726853292.51972: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile_reconfigure.yml" ] }
11762 1726853292.52056: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.52059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.52070: variable 'omit' from source: magic vars
11762 1726853292.52375: variable 'ansible_distribution_major_version' from source: facts
11762 1726853292.52379: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853292.52381: variable 'omit' from source: magic vars
11762 1726853292.52384: variable 'omit' from source: magic vars
11762 1726853292.52386: variable 'item' from source: unknown
11762 1726853292.52388: variable 'item' from source: unknown
11762 1726853292.52391: variable 'omit' from source: magic vars
11762 1726853292.52394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853292.52396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.52399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.52402: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853292.52412: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.52415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.52488: Set connection var ansible_timeout to 10
11762 1726853292.52491: Set connection var ansible_shell_type to sh
11762 1726853292.52494: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853292.52501: Set connection var ansible_shell_executable to /bin/sh
11762 1726853292.52508: Set connection var ansible_pipelining to False
11762 1726853292.52520: Set connection var ansible_connection to ssh
11762 1726853292.52537: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.52540: variable 'ansible_connection' from source: unknown
11762 1726853292.52545: variable 'ansible_module_compression' from source: unknown
11762 1726853292.52548: variable 'ansible_shell_type' from source: unknown
11762 1726853292.52550: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.52552: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.52555: variable 'ansible_pipelining' from source: unknown
11762 1726853292.52557: variable 'ansible_timeout' from source: unknown
11762 1726853292.52559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.52876: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853292.52879: variable 'omit' from source: magic vars
11762 1726853292.52881: starting attempt loop
11762 1726853292.52884: running the handler
11762 1726853292.52886: variable 'lsr_assert' from source: include params
11762 1726853292.52888: variable 'lsr_assert' from source: include params
11762 1726853292.52890: handler run complete
11762 1726853292.52892: attempt loop complete, returning result
11762 1726853292.52894: variable 'item' from source: unknown
11762 1726853292.52896: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_bond_options.yml" ] }
11762 1726853292.52961: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.52966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.52969: variable 'omit' from source: magic vars
11762 1726853292.53154: variable 'ansible_distribution_major_version' from source: facts
11762 1726853292.53285: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853292.53288: variable 'omit' from source: magic vars
11762 1726853292.53291: variable 'omit' from source: magic vars
11762 1726853292.53293: variable 'item' from source: unknown
11762 1726853292.53295: variable 'item' from source: unknown
11762 1726853292.53297: variable 'omit' from source: magic vars
11762 1726853292.53312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853292.53319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.53325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.53341: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853292.53346: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.53349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.53420: Set connection var ansible_timeout to 10
11762 1726853292.53423: Set connection var ansible_shell_type to sh
11762 1726853292.53428: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853292.53434: Set connection var ansible_shell_executable to /bin/sh
11762 1726853292.53447: Set connection var ansible_pipelining to False
11762 1726853292.53453: Set connection var ansible_connection to ssh
11762 1726853292.53472: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.53475: variable 'ansible_connection' from source: unknown
11762 1726853292.53478: variable 'ansible_module_compression' from source: unknown
11762 1726853292.53480: variable 'ansible_shell_type' from source: unknown
11762 1726853292.53482: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.53484: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.53490: variable 'ansible_pipelining' from source: unknown
11762 1726853292.53492: variable 'ansible_timeout' from source: unknown
11762 1726853292.53524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.53588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853292.53595: variable 'omit' from source: magic vars
11762 1726853292.53600: starting attempt loop
11762 1726853292.53603: running the handler
11762 1726853292.53713: handler run complete
11762 1726853292.53717: attempt loop complete, returning result
11762 1726853292.53729: variable 'item' from source: unknown
11762 1726853292.53976: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" }
11762 1726853292.54035: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.54038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.54040: variable 'omit' from source: magic vars
11762 1726853292.54046: variable 'ansible_distribution_major_version' from source: facts
11762 1726853292.54048: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853292.54051: variable 'omit' from source: magic vars
11762 1726853292.54053: variable 'omit' from source: magic vars
11762 1726853292.54123: variable 'item' from source: unknown
11762 1726853292.54146: variable 'item' from source: unknown
11762 1726853292.54157: variable 'omit' from source: magic vars
11762 1726853292.54183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853292.54190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.54196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.54207: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853292.54210: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.54212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.54503: Set connection var ansible_timeout to 10
11762 1726853292.54506: Set connection var ansible_shell_type to sh
11762 1726853292.54508: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853292.54510: Set connection var ansible_shell_executable to /bin/sh
11762 1726853292.54512: Set connection var ansible_pipelining to False
11762 1726853292.54514: Set connection var ansible_connection to ssh
11762 1726853292.54516: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.54518: variable 'ansible_connection' from source: unknown
11762 1726853292.54520: variable 'ansible_module_compression' from source: unknown
11762 1726853292.54521: variable 'ansible_shell_type' from source: unknown
11762 1726853292.54523: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.54525: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.54527: variable 'ansible_pipelining' from source: unknown
11762 1726853292.54529: variable 'ansible_timeout' from source: unknown
11762 1726853292.54531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.54533: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853292.54535: variable 'omit' from source: magic vars
11762 1726853292.54537: starting attempt loop
11762 1726853292.54539: running the handler
11762 1726853292.54541: variable 'lsr_fail_debug' from source: play vars
11762 1726853292.54546: variable 'lsr_fail_debug' from source: play vars
11762 1726853292.54548: handler run complete
11762 1726853292.54878: attempt loop complete, returning result
11762 1726853292.54881: variable 'item' from source: unknown
11762 1726853292.54883: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] }
11762 1726853292.54947: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.54951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.54953: variable 'omit' from source: magic vars
11762 1726853292.54956: variable 'ansible_distribution_major_version' from source: facts
11762 1726853292.54958: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853292.54960: variable 'omit' from source: magic vars
11762 1726853292.54962: variable 'omit' from source: magic vars
11762 1726853292.55108: variable 'item' from source: unknown
11762 1726853292.55163: variable 'item' from source: unknown
11762 1726853292.55174: variable 'omit' from source: magic vars
11762 1726853292.55393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853292.55399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.55407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853292.55416: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853292.55419: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.55422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.55493: Set connection var ansible_timeout to 10
11762 1726853292.55496: Set connection var ansible_shell_type to sh
11762 1726853292.55501: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853292.55506: Set connection var ansible_shell_executable to /bin/sh
11762 1726853292.55514: Set connection var ansible_pipelining to False
11762 1726853292.55521: Set connection var ansible_connection to ssh
11762 1726853292.55541: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.55547: variable 'ansible_connection' from source: unknown
11762 1726853292.55550: variable 'ansible_module_compression' from source: unknown
11762 1726853292.55552: variable 'ansible_shell_type' from source: unknown
11762 1726853292.55554: variable 'ansible_shell_executable' from source: unknown
11762 1726853292.55556: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.55559: variable 'ansible_pipelining' from source: unknown
11762 1726853292.55561: variable 'ansible_timeout' from source: unknown
11762 1726853292.55563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.55876: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11762 1726853292.55879: variable 'omit' from source: magic vars
11762 1726853292.55882: starting attempt loop
11762 1726853292.55884: running the handler
11762 1726853292.55886: variable 'lsr_cleanup' from source: include params
11762 1726853292.55938: variable 'lsr_cleanup' from source: include params
11762 1726853292.55957: handler run complete
11762 1726853292.55970: attempt loop complete, returning result
11762 1726853292.55985: variable 'item' from source: unknown
11762 1726853292.56153: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml", "tasks/check_network_dns.yml" ] }
11762 1726853292.56676: dumping result to json
11762 1726853292.56680: done dumping result, returning
11762 1726853292.56683: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-d845-03d0-0000000008eb]
11762 1726853292.56685: sending task result for task 02083763-bbaf-d845-03d0-0000000008eb
11762 1726853292.56734: done sending task result for task 02083763-bbaf-d845-03d0-0000000008eb
11762 1726853292.56737: WORKER PROCESS EXITING
11762 1726853292.56793: no more pending results, returning what we have
11762 1726853292.56798: results queue empty
11762 1726853292.56799: checking for any_errors_fatal
11762 1726853292.56806: done checking for any_errors_fatal
11762 1726853292.56807: checking for max_fail_percentage
11762 1726853292.56809: done checking for max_fail_percentage
11762 1726853292.56810: checking to see if all hosts have failed and the running result is not ok
11762 1726853292.56811: done checking to see if all hosts have failed
11762 1726853292.56812: getting the remaining hosts for this loop
11762 1726853292.56814: done getting the remaining hosts for this loop
11762 1726853292.56818: getting the next task for host managed_node2
11762 1726853292.56826: done getting next task for host managed_node2
11762 1726853292.56829: ^ task is: TASK: Include the task 'show_interfaces.yml'
11762 1726853292.56832: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853292.56836: getting variables
11762 1726853292.56837: in VariableManager get_vars()
11762 1726853292.56880: Calling all_inventory to load vars for managed_node2
11762 1726853292.56884: Calling groups_inventory to load vars for managed_node2
11762 1726853292.56887: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853292.56897: Calling all_plugins_play to load vars for managed_node2
11762 1726853292.56901: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853292.56904: Calling groups_plugins_play to load vars for managed_node2
11762 1726853292.60473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853292.63898: done with get_vars()
11762 1726853292.63930: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21
Friday 20 September 2024 13:28:12 -0400 (0:00:00.187) 0:00:43.070 ******
11762 1726853292.64055: entering _queue_task() for managed_node2/include_tasks
11762 1726853292.64467: worker is 1 (out of 1 available)
11762 1726853292.64482: exiting _queue_task() for managed_node2/include_tasks
11762 1726853292.64496: done queuing things up, now waiting for results queue to drain
11762 1726853292.64498: waiting for pending results...
11762 1726853292.64894: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml'
11762 1726853292.64930: in run() - task 02083763-bbaf-d845-03d0-0000000008ec
11762 1726853292.64952: variable 'ansible_search_path' from source: unknown
11762 1726853292.65008: variable 'ansible_search_path' from source: unknown
11762 1726853292.65014: calling self._execute()
11762 1726853292.65127: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.65141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.65157: variable 'omit' from source: magic vars
11762 1726853292.65562: variable 'ansible_distribution_major_version' from source: facts
11762 1726853292.65582: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853292.65660: _execute() done
11762 1726853292.65664: dumping result to json
11762 1726853292.65667: done dumping result, returning
11762 1726853292.65669: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-d845-03d0-0000000008ec]
11762 1726853292.65673: sending task result for task 02083763-bbaf-d845-03d0-0000000008ec
11762 1726853292.65740: done sending task result for task 02083763-bbaf-d845-03d0-0000000008ec
11762 1726853292.65744: WORKER PROCESS EXITING
11762 1726853292.65774: no more pending results, returning what we have
11762 1726853292.65780: in VariableManager get_vars()
11762 1726853292.65829: Calling all_inventory to load vars for managed_node2
11762 1726853292.65832: Calling groups_inventory to load vars for managed_node2
11762 1726853292.65835: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853292.65849: Calling all_plugins_play to load vars for managed_node2
11762 1726853292.65853: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853292.65857: Calling groups_plugins_play to load vars for managed_node2
11762 1726853292.67868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853292.69702: done with get_vars()
11762 1726853292.69722: variable 'ansible_search_path' from source: unknown
11762 1726853292.69723: variable 'ansible_search_path' from source: unknown
11762 1726853292.69759: we have included files to process
11762 1726853292.69760: generating all_blocks data
11762 1726853292.69762: done generating all_blocks data
11762 1726853292.69771: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
11762 1726853292.69773: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
11762 1726853292.69775: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
11762 1726853292.69884: in VariableManager get_vars()
11762 1726853292.69908: done with get_vars()
11762 1726853292.70023: done processing included file
11762 1726853292.70025: iterating over new_blocks loaded from include file
11762 1726853292.70026: in VariableManager get_vars()
11762 1726853292.70040: done with get_vars()
11762 1726853292.70042: filtering new block on tags
11762 1726853292.70070: done filtering new block on tags
11762 1726853292.70074: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2
11762 1726853292.70079: extending task lists for all hosts with included blocks
11762 1726853292.70694: done extending task lists
11762 1726853292.70695: done processing included files
11762 1726853292.70696: results queue empty
11762 1726853292.70697: checking for any_errors_fatal
11762 1726853292.70703: done checking for any_errors_fatal
11762 1726853292.70704: checking for max_fail_percentage
11762 1726853292.70705: done checking for max_fail_percentage
11762 1726853292.70706: checking to see if all hosts have failed and the running result is not ok
11762 1726853292.70707: done checking to see if all hosts have failed
11762 1726853292.70707: getting the remaining hosts for this loop
11762 1726853292.70708: done getting the remaining hosts for this loop
11762 1726853292.70710: getting the next task for host managed_node2
11762 1726853292.70714: done getting next task for host managed_node2
11762 1726853292.70715: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
11762 1726853292.70718: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853292.70720: getting variables
11762 1726853292.70721: in VariableManager get_vars()
11762 1726853292.70731: Calling all_inventory to load vars for managed_node2
11762 1726853292.70732: Calling groups_inventory to load vars for managed_node2
11762 1726853292.70734: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853292.70739: Calling all_plugins_play to load vars for managed_node2
11762 1726853292.70741: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853292.70745: Calling groups_plugins_play to load vars for managed_node2
11762 1726853292.72576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853292.74218: done with get_vars()
11762 1726853292.74244: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 13:28:12 -0400 (0:00:00.102) 0:00:43.173 ******
11762 1726853292.74330: entering _queue_task() for managed_node2/include_tasks
11762 1726853292.74703: worker is 1 (out of 1 available)
11762 1726853292.74831: exiting _queue_task() for managed_node2/include_tasks
11762 1726853292.74843: done queuing things up, now waiting for results queue to drain
11762 1726853292.74845: waiting for pending results...
11762 1726853292.75193: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml'
11762 1726853292.75199: in run() - task 02083763-bbaf-d845-03d0-000000000913
11762 1726853292.75203: variable 'ansible_search_path' from source: unknown
11762 1726853292.75206: variable 'ansible_search_path' from source: unknown
11762 1726853292.75229: calling self._execute()
11762 1726853292.75336: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853292.75349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853292.75366: variable 'omit' from source: magic vars
11762 1726853292.75832: variable 'ansible_distribution_major_version' from source: facts
11762 1726853292.75835: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853292.75838: _execute() done
11762 1726853292.75841: dumping result to json
11762 1726853292.75843: done dumping result, returning
11762 1726853292.75846: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-d845-03d0-000000000913]
11762 1726853292.75848: sending task result for task 02083763-bbaf-d845-03d0-000000000913
11762 1726853292.75921: done sending task result for task 02083763-bbaf-d845-03d0-000000000913
11762 1726853292.75924: WORKER PROCESS EXITING
11762 1726853292.75962: no more pending results, returning what we have
11762 1726853292.75969: in VariableManager get_vars()
11762 1726853292.76017: Calling all_inventory to load vars for managed_node2
11762 1726853292.76021: Calling groups_inventory to load vars for managed_node2
11762 1726853292.76023: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853292.76037: Calling all_plugins_play to load vars for managed_node2
11762 1726853292.76040: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853292.76043: Calling groups_plugins_play to load vars for managed_node2
11762 1726853292.77744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853292.79506: done with get_vars()
11762 1726853292.79526: variable 'ansible_search_path' from source: unknown
11762 1726853292.79528: variable 'ansible_search_path' from source: unknown
11762 1726853292.79572: we have included files to process
11762 1726853292.79573: generating all_blocks data
11762 1726853292.79575: done generating all_blocks data
11762 1726853292.79577: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
11762 1726853292.79578: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
11762 1726853292.79580: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
11762 1726853292.79858: done processing included file
11762 1726853292.79860: iterating over new_blocks loaded from include file
11762 1726853292.79862: in VariableManager get_vars()
11762 1726853292.79886: done with get_vars()
11762 1726853292.79888: filtering new block on tags
11762 1726853292.79927: done filtering new block on tags
11762 1726853292.79930: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2
11762 1726853292.79935: extending task lists for all hosts with included blocks
11762 1726853292.80094: done extending task lists
11762 1726853292.80095: done processing included files
11762 1726853292.80096: results queue empty
11762 1726853292.80097: checking for any_errors_fatal
11762 1726853292.80100: done checking for any_errors_fatal
11762 1726853292.80100: checking for max_fail_percentage
11762 1726853292.80102: done checking for max_fail_percentage
11762 1726853292.80102: checking to see if all hosts have failed and the running result is not ok
11762 1726853292.80103: done checking to see if all hosts have failed
11762 1726853292.80104: getting the remaining hosts for this loop
11762 1726853292.80105: done getting the remaining hosts for this loop
11762 1726853292.80108: getting the next task for host managed_node2
11762 1726853292.80112: done getting next task for host managed_node2
11762 1726853292.80114: ^ task is: TASK: Gather current interface info
11762 1726853292.80118: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853292.80120: getting variables
11762 1726853292.80121: in VariableManager get_vars()
11762 1726853292.80132: Calling all_inventory to load vars for managed_node2
11762 1726853292.80135: Calling groups_inventory to load vars for managed_node2
11762 1726853292.80136: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853292.80142: Calling all_plugins_play to load vars for managed_node2
11762 1726853292.80145: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853292.80148: Calling groups_plugins_play to load vars for managed_node2
11762 1726853292.81343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853292.82828: done with get_vars()
11762 1726853292.82853: done getting variables
11762 1726853292.82900: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 13:28:12 -0400 (0:00:00.086) 0:00:43.259 ******
11762 1726853292.82938: entering _queue_task() for managed_node2/command
11762 1726853292.83498: worker is 1 (out of 1 available)
11762 1726853292.83508: exiting _queue_task() for managed_node2/command
11762 1726853292.83520: done queuing things up, now waiting for results queue to drain
11762 1726853292.83521: waiting for pending results...
11762 1726853292.83691: running TaskExecutor() for managed_node2/TASK: Gather current interface info 11762 1726853292.83784: in run() - task 02083763-bbaf-d845-03d0-00000000094e 11762 1726853292.83859: variable 'ansible_search_path' from source: unknown 11762 1726853292.83863: variable 'ansible_search_path' from source: unknown 11762 1726853292.83866: calling self._execute() 11762 1726853292.83955: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.83977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.83992: variable 'omit' from source: magic vars 11762 1726853292.84368: variable 'ansible_distribution_major_version' from source: facts 11762 1726853292.84388: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853292.84407: variable 'omit' from source: magic vars 11762 1726853292.84462: variable 'omit' from source: magic vars 11762 1726853292.84512: variable 'omit' from source: magic vars 11762 1726853292.84576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853292.84594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853292.84625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853292.84649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.84666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853292.84730: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853292.84734: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.84736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 
1726853292.84826: Set connection var ansible_timeout to 10 11762 1726853292.84841: Set connection var ansible_shell_type to sh 11762 1726853292.84947: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853292.84950: Set connection var ansible_shell_executable to /bin/sh 11762 1726853292.84953: Set connection var ansible_pipelining to False 11762 1726853292.84955: Set connection var ansible_connection to ssh 11762 1726853292.84957: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.84959: variable 'ansible_connection' from source: unknown 11762 1726853292.84961: variable 'ansible_module_compression' from source: unknown 11762 1726853292.84964: variable 'ansible_shell_type' from source: unknown 11762 1726853292.84966: variable 'ansible_shell_executable' from source: unknown 11762 1726853292.84968: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853292.84970: variable 'ansible_pipelining' from source: unknown 11762 1726853292.84974: variable 'ansible_timeout' from source: unknown 11762 1726853292.84976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853292.85110: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853292.85130: variable 'omit' from source: magic vars 11762 1726853292.85141: starting attempt loop 11762 1726853292.85149: running the handler 11762 1726853292.85178: _low_level_execute_command(): starting 11762 1726853292.85191: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853292.85996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853292.86070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853292.86092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853292.86114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853292.86279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853292.88312: stdout chunk (state=3): >>>/root <<< 11762 1726853292.88316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853292.88320: stdout chunk (state=3): >>><<< 11762 1726853292.88322: stderr chunk (state=3): >>><<< 11762 1726853292.88691: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853292.88695: _low_level_execute_command(): starting 11762 1726853292.88699: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079 `" && echo ansible-tmp-1726853292.8843863-13901-176702577484079="` echo /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079 `" ) && sleep 0' 11762 1726853292.89967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853292.90292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853292.90378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853292.90500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853292.92564: stdout chunk (state=3): >>>ansible-tmp-1726853292.8843863-13901-176702577484079=/root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079 <<< 11762 1726853292.92721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853292.92763: stderr chunk (state=3): >>><<< 11762 1726853292.92770: stdout chunk (state=3): >>><<< 11762 1726853292.92796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853292.8843863-13901-176702577484079=/root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853292.92828: variable 'ansible_module_compression' from source: unknown 11762 1726853292.92882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853292.93008: variable 'ansible_facts' from source: unknown 11762 1726853292.93384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/AnsiballZ_command.py 11762 1726853292.93691: Sending initial data 11762 1726853292.93695: Sent initial data (156 bytes) 11762 1726853292.95030: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853292.95281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853292.95294: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853292.95309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853292.95476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853292.97289: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853292.97349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853292.97421: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp243c6wr_ /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/AnsiballZ_command.py <<< 11762 1726853292.97425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/AnsiballZ_command.py" <<< 11762 1726853292.97553: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp243c6wr_" to remote "/root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/AnsiballZ_command.py" <<< 11762 1726853292.99173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853292.99195: stderr chunk (state=3): >>><<< 11762 1726853292.99206: stdout chunk (state=3): >>><<< 11762 1726853292.99231: done transferring module to remote 11762 1726853292.99241: _low_level_execute_command(): starting 11762 1726853292.99247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/ /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/AnsiballZ_command.py && sleep 0' 11762 1726853293.00480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853293.00554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853293.00606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853293.00610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853293.00688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.00961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853293.02785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853293.02788: stderr chunk (state=3): >>><<< 11762 1726853293.02791: stdout chunk (state=3): >>><<< 11762 1726853293.02829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853293.02837: _low_level_execute_command(): starting 11762 1726853293.02840: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/AnsiballZ_command.py && sleep 0' 11762 1726853293.03940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853293.04037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853293.04041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853293.04047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853293.04050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853293.04053: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853293.04133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853293.04253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853293.04269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.04456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853293.20414: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:28:13.199627", "end": "2024-09-20 13:28:13.203064", "delta": "0:00:00.003437", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853293.22021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853293.22076: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853293.22080: stdout chunk (state=3): >>><<< 11762 1726853293.22092: stderr chunk (state=3): >>><<< 11762 1726853293.22202: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:28:13.199627", "end": "2024-09-20 13:28:13.203064", "delta": "0:00:00.003437", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853293.22206: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853293.22209: _low_level_execute_command(): starting 11762 1726853293.22212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853292.8843863-13901-176702577484079/ > /dev/null 2>&1 && sleep 0' 11762 1726853293.23360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853293.23364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853293.23576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853293.23592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.23687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853293.25561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853293.25791: stderr chunk (state=3): >>><<< 11762 1726853293.25795: stdout chunk (state=3): >>><<< 11762 1726853293.25811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853293.25818: handler run complete 11762 1726853293.25847: Evaluated 
conditional (False): False 11762 1726853293.25895: attempt loop complete, returning result 11762 1726853293.25899: _execute() done 11762 1726853293.25901: dumping result to json 11762 1726853293.25904: done dumping result, returning 11762 1726853293.25906: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-d845-03d0-00000000094e] 11762 1726853293.26087: sending task result for task 02083763-bbaf-d845-03d0-00000000094e ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003437", "end": "2024-09-20 13:28:13.203064", "rc": 0, "start": "2024-09-20 13:28:13.199627" } STDOUT: bonding_masters eth0 lo 11762 1726853293.26378: no more pending results, returning what we have 11762 1726853293.26383: results queue empty 11762 1726853293.26384: checking for any_errors_fatal 11762 1726853293.26385: done checking for any_errors_fatal 11762 1726853293.26386: checking for max_fail_percentage 11762 1726853293.26388: done checking for max_fail_percentage 11762 1726853293.26389: checking to see if all hosts have failed and the running result is not ok 11762 1726853293.26390: done checking to see if all hosts have failed 11762 1726853293.26390: getting the remaining hosts for this loop 11762 1726853293.26392: done getting the remaining hosts for this loop 11762 1726853293.26396: getting the next task for host managed_node2 11762 1726853293.26403: done getting next task for host managed_node2 11762 1726853293.26405: ^ task is: TASK: Set current_interfaces 11762 1726853293.26410: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853293.26415: getting variables 11762 1726853293.26416: in VariableManager get_vars() 11762 1726853293.26457: Calling all_inventory to load vars for managed_node2 11762 1726853293.26460: Calling groups_inventory to load vars for managed_node2 11762 1726853293.26462: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853293.26675: Calling all_plugins_play to load vars for managed_node2 11762 1726853293.26680: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853293.26684: Calling groups_plugins_play to load vars for managed_node2 11762 1726853293.27384: done sending task result for task 02083763-bbaf-d845-03d0-00000000094e 11762 1726853293.27388: WORKER PROCESS EXITING 11762 1726853293.29548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853293.32739: done with get_vars() 11762 1726853293.32770: done getting variables 11762 1726853293.32833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:28:13 -0400 (0:00:00.499) 0:00:43.759 ****** 11762 1726853293.32869: entering _queue_task() for managed_node2/set_fact 11762 1726853293.33632: worker is 1 (out of 1 available) 11762 1726853293.33648: exiting _queue_task() for managed_node2/set_fact 11762 1726853293.33663: done queuing things up, now waiting for results queue to drain 11762 1726853293.33665: waiting for pending results... 11762 1726853293.34158: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 11762 1726853293.34264: in run() - task 02083763-bbaf-d845-03d0-00000000094f 11762 1726853293.34484: variable 'ansible_search_path' from source: unknown 11762 1726853293.34489: variable 'ansible_search_path' from source: unknown 11762 1726853293.34529: calling self._execute() 11762 1726853293.34699: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.34706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.34832: variable 'omit' from source: magic vars 11762 1726853293.35527: variable 'ansible_distribution_major_version' from source: facts 11762 1726853293.35539: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853293.35546: variable 'omit' from source: magic vars 11762 1726853293.35702: variable 'omit' from source: magic vars 11762 1726853293.35883: variable '_current_interfaces' from source: set_fact 11762 1726853293.35950: variable 'omit' from source: magic vars 11762 1726853293.35988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 
1726853293.36139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853293.36161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853293.36245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853293.36255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853293.36385: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853293.36388: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.36391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.36507: Set connection var ansible_timeout to 10 11762 1726853293.36511: Set connection var ansible_shell_type to sh 11762 1726853293.36517: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853293.36522: Set connection var ansible_shell_executable to /bin/sh 11762 1726853293.36530: Set connection var ansible_pipelining to False 11762 1726853293.36538: Set connection var ansible_connection to ssh 11762 1726853293.36674: variable 'ansible_shell_executable' from source: unknown 11762 1726853293.36678: variable 'ansible_connection' from source: unknown 11762 1726853293.36681: variable 'ansible_module_compression' from source: unknown 11762 1726853293.36684: variable 'ansible_shell_type' from source: unknown 11762 1726853293.36688: variable 'ansible_shell_executable' from source: unknown 11762 1726853293.36690: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.36695: variable 'ansible_pipelining' from source: unknown 11762 1726853293.36697: variable 'ansible_timeout' from source: unknown 11762 1726853293.36701: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 11762 1726853293.36946: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853293.36956: variable 'omit' from source: magic vars 11762 1726853293.36961: starting attempt loop 11762 1726853293.36964: running the handler 11762 1726853293.37081: handler run complete 11762 1726853293.37257: attempt loop complete, returning result 11762 1726853293.37259: _execute() done 11762 1726853293.37261: dumping result to json 11762 1726853293.37263: done dumping result, returning 11762 1726853293.37266: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-d845-03d0-00000000094f] 11762 1726853293.37268: sending task result for task 02083763-bbaf-d845-03d0-00000000094f 11762 1726853293.37332: done sending task result for task 02083763-bbaf-d845-03d0-00000000094f 11762 1726853293.37336: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 11762 1726853293.37420: no more pending results, returning what we have 11762 1726853293.37424: results queue empty 11762 1726853293.37425: checking for any_errors_fatal 11762 1726853293.37436: done checking for any_errors_fatal 11762 1726853293.37437: checking for max_fail_percentage 11762 1726853293.37439: done checking for max_fail_percentage 11762 1726853293.37440: checking to see if all hosts have failed and the running result is not ok 11762 1726853293.37441: done checking to see if all hosts have failed 11762 1726853293.37441: getting the remaining hosts for this loop 11762 1726853293.37446: done getting the remaining hosts for this loop 11762 1726853293.37449: getting the next task for host managed_node2 
11762 1726853293.37463: done getting next task for host managed_node2 11762 1726853293.37466: ^ task is: TASK: Show current_interfaces 11762 1726853293.37470: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853293.37475: getting variables 11762 1726853293.37477: in VariableManager get_vars() 11762 1726853293.37517: Calling all_inventory to load vars for managed_node2 11762 1726853293.37520: Calling groups_inventory to load vars for managed_node2 11762 1726853293.37522: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853293.37532: Calling all_plugins_play to load vars for managed_node2 11762 1726853293.37535: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853293.37537: Calling groups_plugins_play to load vars for managed_node2 11762 1726853293.39099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853293.40720: done with get_vars() 11762 1726853293.40750: done getting variables 11762 1726853293.40813: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:28:13 -0400 (0:00:00.079) 0:00:43.838 ****** 11762 1726853293.40847: entering _queue_task() for managed_node2/debug 11762 1726853293.41206: worker is 1 (out of 1 available) 11762 1726853293.41220: exiting _queue_task() for managed_node2/debug 11762 1726853293.41348: done queuing things up, now waiting for results queue to drain 11762 1726853293.41350: waiting for pending results... 
11762 1726853293.41674: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 11762 1726853293.41697: in run() - task 02083763-bbaf-d845-03d0-000000000914 11762 1726853293.41719: variable 'ansible_search_path' from source: unknown 11762 1726853293.41736: variable 'ansible_search_path' from source: unknown 11762 1726853293.41786: calling self._execute() 11762 1726853293.41919: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.41933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.41975: variable 'omit' from source: magic vars 11762 1726853293.42429: variable 'ansible_distribution_major_version' from source: facts 11762 1726853293.42433: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853293.42438: variable 'omit' from source: magic vars 11762 1726853293.42482: variable 'omit' from source: magic vars 11762 1726853293.42589: variable 'current_interfaces' from source: set_fact 11762 1726853293.42624: variable 'omit' from source: magic vars 11762 1726853293.42679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853293.42758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853293.42763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853293.42778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853293.42866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853293.42869: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853293.42874: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.42876: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.42949: Set connection var ansible_timeout to 10 11762 1726853293.42958: Set connection var ansible_shell_type to sh 11762 1726853293.42974: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853293.42987: Set connection var ansible_shell_executable to /bin/sh 11762 1726853293.43002: Set connection var ansible_pipelining to False 11762 1726853293.43012: Set connection var ansible_connection to ssh 11762 1726853293.43038: variable 'ansible_shell_executable' from source: unknown 11762 1726853293.43075: variable 'ansible_connection' from source: unknown 11762 1726853293.43082: variable 'ansible_module_compression' from source: unknown 11762 1726853293.43085: variable 'ansible_shell_type' from source: unknown 11762 1726853293.43087: variable 'ansible_shell_executable' from source: unknown 11762 1726853293.43089: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.43091: variable 'ansible_pipelining' from source: unknown 11762 1726853293.43093: variable 'ansible_timeout' from source: unknown 11762 1726853293.43097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.43270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853293.43289: variable 'omit' from source: magic vars 11762 1726853293.43303: starting attempt loop 11762 1726853293.43313: running the handler 11762 1726853293.43365: handler run complete 11762 1726853293.43386: attempt loop complete, returning result 11762 1726853293.43393: _execute() done 11762 1726853293.43399: dumping result to json 11762 1726853293.43405: done dumping result, returning 11762 1726853293.43423: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-d845-03d0-000000000914] 11762 1726853293.43434: sending task result for task 02083763-bbaf-d845-03d0-000000000914 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 11762 1726853293.43684: no more pending results, returning what we have 11762 1726853293.43688: results queue empty 11762 1726853293.43689: checking for any_errors_fatal 11762 1726853293.43697: done checking for any_errors_fatal 11762 1726853293.43698: checking for max_fail_percentage 11762 1726853293.43700: done checking for max_fail_percentage 11762 1726853293.43701: checking to see if all hosts have failed and the running result is not ok 11762 1726853293.43702: done checking to see if all hosts have failed 11762 1726853293.43702: getting the remaining hosts for this loop 11762 1726853293.43704: done getting the remaining hosts for this loop 11762 1726853293.43707: getting the next task for host managed_node2 11762 1726853293.43722: done getting next task for host managed_node2 11762 1726853293.43729: ^ task is: TASK: Setup 11762 1726853293.43976: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853293.43981: getting variables 11762 1726853293.43983: in VariableManager get_vars() 11762 1726853293.44019: Calling all_inventory to load vars for managed_node2 11762 1726853293.44021: Calling groups_inventory to load vars for managed_node2 11762 1726853293.44024: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853293.44036: Calling all_plugins_play to load vars for managed_node2 11762 1726853293.44039: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853293.44042: Calling groups_plugins_play to load vars for managed_node2 11762 1726853293.44777: done sending task result for task 02083763-bbaf-d845-03d0-000000000914 11762 1726853293.44781: WORKER PROCESS EXITING 11762 1726853293.46024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853293.47709: done with get_vars() 11762 1726853293.47734: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:28:13 -0400 (0:00:00.069) 0:00:43.908 ****** 11762 1726853293.47836: entering _queue_task() for managed_node2/include_tasks 11762 1726853293.48184: worker is 1 (out of 1 available) 11762 1726853293.48196: exiting _queue_task() for managed_node2/include_tasks 11762 1726853293.48208: done queuing things up, now waiting for results queue to drain 11762 1726853293.48210: waiting for pending results... 
11762 1726853293.48506: running TaskExecutor() for managed_node2/TASK: Setup 11762 1726853293.48616: in run() - task 02083763-bbaf-d845-03d0-0000000008ed 11762 1726853293.48638: variable 'ansible_search_path' from source: unknown 11762 1726853293.48646: variable 'ansible_search_path' from source: unknown 11762 1726853293.48703: variable 'lsr_setup' from source: include params 11762 1726853293.48913: variable 'lsr_setup' from source: include params 11762 1726853293.48986: variable 'omit' from source: magic vars 11762 1726853293.49143: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.49159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.49177: variable 'omit' from source: magic vars 11762 1726853293.49425: variable 'ansible_distribution_major_version' from source: facts 11762 1726853293.49443: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853293.49459: variable 'item' from source: unknown 11762 1726853293.49523: variable 'item' from source: unknown 11762 1726853293.49567: variable 'item' from source: unknown 11762 1726853293.49630: variable 'item' from source: unknown 11762 1726853293.50079: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.50085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.50088: variable 'omit' from source: magic vars 11762 1726853293.50090: variable 'ansible_distribution_major_version' from source: facts 11762 1726853293.50092: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853293.50094: variable 'item' from source: unknown 11762 1726853293.50096: variable 'item' from source: unknown 11762 1726853293.50124: variable 'item' from source: unknown 11762 1726853293.50195: variable 'item' from source: unknown 11762 1726853293.50376: dumping result to json 11762 1726853293.50380: done dumping result, returning 11762 
1726853293.50382: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-d845-03d0-0000000008ed] 11762 1726853293.50385: sending task result for task 02083763-bbaf-d845-03d0-0000000008ed 11762 1726853293.50452: no more pending results, returning what we have 11762 1726853293.50457: in VariableManager get_vars() 11762 1726853293.50511: Calling all_inventory to load vars for managed_node2 11762 1726853293.50514: Calling groups_inventory to load vars for managed_node2 11762 1726853293.50517: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853293.50523: done sending task result for task 02083763-bbaf-d845-03d0-0000000008ed 11762 1726853293.50526: WORKER PROCESS EXITING 11762 1726853293.50777: Calling all_plugins_play to load vars for managed_node2 11762 1726853293.50781: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853293.50784: Calling groups_plugins_play to load vars for managed_node2 11762 1726853293.52101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853293.53838: done with get_vars() 11762 1726853293.53856: variable 'ansible_search_path' from source: unknown 11762 1726853293.53858: variable 'ansible_search_path' from source: unknown 11762 1726853293.53905: variable 'ansible_search_path' from source: unknown 11762 1726853293.53907: variable 'ansible_search_path' from source: unknown 11762 1726853293.53932: we have included files to process 11762 1726853293.53933: generating all_blocks data 11762 1726853293.53935: done generating all_blocks data 11762 1726853293.53940: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11762 1726853293.53941: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11762 1726853293.53943: 
Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11762 1726853293.54885: done processing included file 11762 1726853293.54887: iterating over new_blocks loaded from include file 11762 1726853293.54889: in VariableManager get_vars() 11762 1726853293.54906: done with get_vars() 11762 1726853293.54908: filtering new block on tags 11762 1726853293.54961: done filtering new block on tags 11762 1726853293.54964: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/create_test_interfaces_with_dhcp.yml) 11762 1726853293.54969: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11762 1726853293.54970: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11762 1726853293.54978: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11762 1726853293.55059: in VariableManager get_vars() 11762 1726853293.55086: done with get_vars() 11762 1726853293.55093: variable 'item' from source: include params 11762 1726853293.55192: variable 'item' from source: include params 11762 1726853293.55221: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11762 1726853293.55295: in VariableManager get_vars() 11762 1726853293.55321: done with get_vars() 11762 1726853293.55443: in VariableManager 
get_vars() 11762 1726853293.55464: done with get_vars() 11762 1726853293.55469: variable 'item' from source: include params 11762 1726853293.55535: variable 'item' from source: include params 11762 1726853293.55564: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11762 1726853293.55705: in VariableManager get_vars() 11762 1726853293.55727: done with get_vars() 11762 1726853293.55828: done processing included file 11762 1726853293.55829: iterating over new_blocks loaded from include file 11762 1726853293.55831: in VariableManager get_vars() 11762 1726853293.55851: done with get_vars() 11762 1726853293.55853: filtering new block on tags 11762 1726853293.55925: done filtering new block on tags 11762 1726853293.55928: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node2 => (item=tasks/assert_dhcp_device_present.yml) 11762 1726853293.55932: extending task lists for all hosts with included blocks 11762 1726853293.56525: done extending task lists 11762 1726853293.56526: done processing included files 11762 1726853293.56527: results queue empty 11762 1726853293.56527: checking for any_errors_fatal 11762 1726853293.56531: done checking for any_errors_fatal 11762 1726853293.56531: checking for max_fail_percentage 11762 1726853293.56532: done checking for max_fail_percentage 11762 1726853293.56533: checking to see if all hosts have failed and the running result is not ok 11762 1726853293.56534: done checking to see if all hosts have failed 11762 1726853293.56540: getting the remaining hosts for this loop 11762 1726853293.56541: done getting the remaining hosts for this loop 11762 
1726853293.56543: getting the next task for host managed_node2 11762 1726853293.56547: done getting next task for host managed_node2 11762 1726853293.56549: ^ task is: TASK: Install dnsmasq 11762 1726853293.56552: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853293.56554: getting variables 11762 1726853293.56555: in VariableManager get_vars() 11762 1726853293.56565: Calling all_inventory to load vars for managed_node2 11762 1726853293.56567: Calling groups_inventory to load vars for managed_node2 11762 1726853293.56569: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853293.56576: Calling all_plugins_play to load vars for managed_node2 11762 1726853293.56579: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853293.56581: Calling groups_plugins_play to load vars for managed_node2 11762 1726853293.57730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853293.59311: done with get_vars() 11762 1726853293.59332: done getting variables 11762 1726853293.59374: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:28:13 -0400 (0:00:00.115) 0:00:44.024 ****** 11762 1726853293.59402: entering _queue_task() for managed_node2/package 11762 1726853293.59801: worker is 1 (out of 1 available) 11762 1726853293.59816: exiting _queue_task() for managed_node2/package 11762 1726853293.59832: done queuing things up, now waiting for results queue to drain 11762 1726853293.59834: waiting for pending results... 
11762 1726853293.60110: running TaskExecutor() for managed_node2/TASK: Install dnsmasq 11762 1726853293.60299: in run() - task 02083763-bbaf-d845-03d0-000000000974 11762 1726853293.60303: variable 'ansible_search_path' from source: unknown 11762 1726853293.60305: variable 'ansible_search_path' from source: unknown 11762 1726853293.60308: calling self._execute() 11762 1726853293.60393: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.60412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.60425: variable 'omit' from source: magic vars 11762 1726853293.60807: variable 'ansible_distribution_major_version' from source: facts 11762 1726853293.60824: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853293.60839: variable 'omit' from source: magic vars 11762 1726853293.60892: variable 'omit' from source: magic vars 11762 1726853293.61094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853293.63346: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853293.63477: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853293.63481: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853293.63507: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853293.63535: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853293.63636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853293.63669: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853293.63702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853293.63753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853293.63774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853293.63889: variable '__network_is_ostree' from source: set_fact 11762 1726853293.63941: variable 'omit' from source: magic vars 11762 1726853293.63945: variable 'omit' from source: magic vars 11762 1726853293.63967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853293.64004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853293.64028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853293.64057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853293.64075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853293.64159: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853293.64162: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.64164: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11762 1726853293.64229: Set connection var ansible_timeout to 10 11762 1726853293.64238: Set connection var ansible_shell_type to sh 11762 1726853293.64249: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853293.64263: Set connection var ansible_shell_executable to /bin/sh 11762 1726853293.64281: Set connection var ansible_pipelining to False 11762 1726853293.64292: Set connection var ansible_connection to ssh 11762 1726853293.64319: variable 'ansible_shell_executable' from source: unknown 11762 1726853293.64327: variable 'ansible_connection' from source: unknown 11762 1726853293.64378: variable 'ansible_module_compression' from source: unknown 11762 1726853293.64381: variable 'ansible_shell_type' from source: unknown 11762 1726853293.64383: variable 'ansible_shell_executable' from source: unknown 11762 1726853293.64385: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853293.64387: variable 'ansible_pipelining' from source: unknown 11762 1726853293.64389: variable 'ansible_timeout' from source: unknown 11762 1726853293.64391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853293.64495: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853293.64511: variable 'omit' from source: magic vars 11762 1726853293.64520: starting attempt loop 11762 1726853293.64527: running the handler 11762 1726853293.64595: variable 'ansible_facts' from source: unknown 11762 1726853293.64598: variable 'ansible_facts' from source: unknown 11762 1726853293.64600: _low_level_execute_command(): starting 11762 1726853293.64602: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853293.65309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853293.65373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853293.65435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853293.65478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.65721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853293.67445: stdout chunk (state=3): >>>/root <<< 11762 1726853293.67584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853293.67589: stdout chunk (state=3): >>><<< 11762 1726853293.67602: stderr chunk (state=3): >>><<< 11762 1726853293.67651: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853293.67662: _low_level_execute_command(): starting 11762 1726853293.67666: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996 `" && echo ansible-tmp-1726853293.6762373-13931-135621560625996="` echo /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996 `" ) && sleep 0' 11762 1726853293.68711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853293.68877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853293.68881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853293.68883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853293.68886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853293.69065: stderr chunk (state=3): >>>debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853293.69093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853293.69179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853293.69187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.69295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853293.71332: stdout chunk (state=3): >>>ansible-tmp-1726853293.6762373-13931-135621560625996=/root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996 <<< 11762 1726853293.71436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853293.71495: stderr chunk (state=3): >>><<< 11762 1726853293.71498: stdout chunk (state=3): >>><<< 11762 1726853293.71569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853293.6762373-13931-135621560625996=/root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853293.71639: variable 'ansible_module_compression' from source: unknown 11762 1726853293.71662: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11762 1726853293.71711: variable 'ansible_facts' from source: unknown 11762 1726853293.72048: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/AnsiballZ_dnf.py 11762 1726853293.72609: Sending initial data 11762 1726853293.72612: Sent initial data (152 bytes) 11762 1726853293.73834: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853293.73966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.74069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853293.75734: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11762 1726853293.75739: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853293.75806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853293.75882: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpydhbhf1u /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/AnsiballZ_dnf.py <<< 11762 1726853293.75886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/AnsiballZ_dnf.py" <<< 11762 1726853293.76031: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpydhbhf1u" to remote "/root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/AnsiballZ_dnf.py" <<< 11762 1726853293.77774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853293.77826: stderr chunk (state=3): >>><<< 11762 1726853293.77829: stdout chunk (state=3): >>><<< 11762 1726853293.77859: done transferring module to remote 11762 1726853293.77985: _low_level_execute_command(): starting 11762 1726853293.77991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/ /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/AnsiballZ_dnf.py && sleep 0' 11762 1726853293.79363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853293.79398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853293.79403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853293.79421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.79605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853293.81510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853293.81559: stderr chunk (state=3): >>><<< 11762 1726853293.81563: stdout chunk (state=3): >>><<< 11762 1726853293.81583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853293.81586: _low_level_execute_command(): starting 11762 1726853293.81591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/AnsiballZ_dnf.py && sleep 0' 11762 1726853293.82888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853293.83091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853293.83202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853293.83230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853293.83254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853293.83441: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11762 1726853294.26454: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11762 1726853294.30759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853294.30782: stderr chunk (state=3): >>><<< 11762 1726853294.30787: stdout chunk (state=3): >>><<< 11762 1726853294.30802: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853294.30839: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853294.30847: _low_level_execute_command(): starting 11762 1726853294.30852: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853293.6762373-13931-135621560625996/ > /dev/null 2>&1 && sleep 0' 11762 1726853294.31276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.31289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853294.31297: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.31309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.31363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853294.31369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853294.31446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853294.33461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853294.33467: stdout chunk (state=3): >>><<< 11762 1726853294.33470: stderr chunk (state=3): >>><<< 11762 1726853294.33537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 11762 1726853294.33541: handler run complete 11762 1726853294.33761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853294.34163: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853294.34199: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853294.34222: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853294.34244: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853294.34307: variable '__install_status' from source: set_fact 11762 1726853294.34327: Evaluated conditional (__install_status is success): True 11762 1726853294.34358: attempt loop complete, returning result 11762 1726853294.34361: _execute() done 11762 1726853294.34364: dumping result to json 11762 1726853294.34366: done dumping result, returning 11762 1726853294.34368: done running TaskExecutor() for managed_node2/TASK: Install dnsmasq [02083763-bbaf-d845-03d0-000000000974] 11762 1726853294.34370: sending task result for task 02083763-bbaf-d845-03d0-000000000974 ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11762 1726853294.34575: no more pending results, returning what we have 11762 1726853294.34580: results queue empty 11762 1726853294.34581: checking for any_errors_fatal 11762 1726853294.34582: done checking for any_errors_fatal 11762 1726853294.34583: checking for max_fail_percentage 11762 1726853294.34584: done checking for max_fail_percentage 11762 1726853294.34585: checking to see if all hosts have failed and the running result is not ok 11762 1726853294.34586: done checking to see if all hosts have failed 11762 1726853294.34586: getting the remaining hosts for this loop 11762 1726853294.34589: done getting the 
remaining hosts for this loop 11762 1726853294.34597: getting the next task for host managed_node2 11762 1726853294.34604: done getting next task for host managed_node2 11762 1726853294.34606: ^ task is: TASK: Install pgrep, sysctl 11762 1726853294.34609: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853294.34612: getting variables 11762 1726853294.34614: in VariableManager get_vars() 11762 1726853294.34709: Calling all_inventory to load vars for managed_node2 11762 1726853294.34712: Calling groups_inventory to load vars for managed_node2 11762 1726853294.34714: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853294.34782: done sending task result for task 02083763-bbaf-d845-03d0-000000000974 11762 1726853294.34786: WORKER PROCESS EXITING 11762 1726853294.34798: Calling all_plugins_play to load vars for managed_node2 11762 1726853294.34801: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853294.34803: Calling groups_plugins_play to load vars for managed_node2 11762 1726853294.35835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853294.37040: done with get_vars() 11762 1726853294.37066: done getting variables 11762 1726853294.37126: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 13:28:14 -0400 (0:00:00.777) 0:00:44.802 ****** 11762 1726853294.37164: entering _queue_task() for managed_node2/package 11762 1726853294.37506: worker is 1 (out of 1 available) 11762 1726853294.37520: exiting _queue_task() for managed_node2/package 11762 1726853294.37534: done queuing things up, now waiting for results queue to drain 11762 1726853294.37536: waiting for pending results... 
11762 1726853294.37852: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11762 1726853294.37967: in run() - task 02083763-bbaf-d845-03d0-000000000975 11762 1726853294.37972: variable 'ansible_search_path' from source: unknown 11762 1726853294.37976: variable 'ansible_search_path' from source: unknown 11762 1726853294.37979: calling self._execute() 11762 1726853294.38058: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853294.38062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853294.38075: variable 'omit' from source: magic vars 11762 1726853294.38584: variable 'ansible_distribution_major_version' from source: facts 11762 1726853294.38588: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853294.38607: variable 'ansible_os_family' from source: facts 11762 1726853294.38611: Evaluated conditional (ansible_os_family == 'RedHat'): True 11762 1726853294.38809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853294.39054: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853294.39088: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853294.39112: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853294.39139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853294.39196: variable 'ansible_distribution_major_version' from source: facts 11762 1726853294.39206: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11762 1726853294.39210: when evaluation is False, skipping this task 11762 1726853294.39212: _execute() done 11762 1726853294.39215: dumping result to json 11762 1726853294.39217: done dumping result, 
returning 11762 1726853294.39223: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [02083763-bbaf-d845-03d0-000000000975] 11762 1726853294.39234: sending task result for task 02083763-bbaf-d845-03d0-000000000975 11762 1726853294.39326: done sending task result for task 02083763-bbaf-d845-03d0-000000000975 11762 1726853294.39329: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11762 1726853294.39412: no more pending results, returning what we have 11762 1726853294.39417: results queue empty 11762 1726853294.39418: checking for any_errors_fatal 11762 1726853294.39428: done checking for any_errors_fatal 11762 1726853294.39429: checking for max_fail_percentage 11762 1726853294.39431: done checking for max_fail_percentage 11762 1726853294.39432: checking to see if all hosts have failed and the running result is not ok 11762 1726853294.39432: done checking to see if all hosts have failed 11762 1726853294.39433: getting the remaining hosts for this loop 11762 1726853294.39435: done getting the remaining hosts for this loop 11762 1726853294.39438: getting the next task for host managed_node2 11762 1726853294.39445: done getting next task for host managed_node2 11762 1726853294.39447: ^ task is: TASK: Install pgrep, sysctl 11762 1726853294.39451: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853294.39454: getting variables 11762 1726853294.39455: in VariableManager get_vars() 11762 1726853294.39491: Calling all_inventory to load vars for managed_node2 11762 1726853294.39494: Calling groups_inventory to load vars for managed_node2 11762 1726853294.39496: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853294.39505: Calling all_plugins_play to load vars for managed_node2 11762 1726853294.39507: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853294.39509: Calling groups_plugins_play to load vars for managed_node2 11762 1726853294.40411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853294.41457: done with get_vars() 11762 1726853294.41477: done getting variables 11762 1726853294.41535: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 13:28:14 -0400 (0:00:00.043) 0:00:44.846 ****** 11762 1726853294.41561: entering _queue_task() for managed_node2/package 11762 1726853294.41886: worker is 1 (out of 1 available) 11762 1726853294.41902: exiting _queue_task() for managed_node2/package 11762 1726853294.41915: done queuing 
things up, now waiting for results queue to drain 11762 1726853294.41917: waiting for pending results... 11762 1726853294.42429: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11762 1726853294.42487: in run() - task 02083763-bbaf-d845-03d0-000000000976 11762 1726853294.42580: variable 'ansible_search_path' from source: unknown 11762 1726853294.42629: variable 'ansible_search_path' from source: unknown 11762 1726853294.42632: calling self._execute() 11762 1726853294.42640: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853294.42644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853294.42646: variable 'omit' from source: magic vars 11762 1726853294.42946: variable 'ansible_distribution_major_version' from source: facts 11762 1726853294.42994: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853294.43082: variable 'ansible_os_family' from source: facts 11762 1726853294.43121: Evaluated conditional (ansible_os_family == 'RedHat'): True 11762 1726853294.43265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853294.43465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853294.43502: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853294.43526: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853294.43555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853294.43626: variable 'ansible_distribution_major_version' from source: facts 11762 1726853294.43636: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11762 1726853294.43642: variable 'omit' from source: magic vars 11762 1726853294.43678: variable 
'omit' from source: magic vars 11762 1726853294.43796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853294.45468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853294.45517: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853294.45549: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853294.45574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853294.45594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853294.45664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853294.45687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853294.45704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853294.45729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853294.45739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853294.45811: variable 
'__network_is_ostree' from source: set_fact 11762 1726853294.45815: variable 'omit' from source: magic vars 11762 1726853294.45839: variable 'omit' from source: magic vars 11762 1726853294.45863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853294.45886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853294.45901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853294.45913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853294.45922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853294.45945: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853294.45951: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853294.45954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853294.46024: Set connection var ansible_timeout to 10 11762 1726853294.46027: Set connection var ansible_shell_type to sh 11762 1726853294.46032: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853294.46037: Set connection var ansible_shell_executable to /bin/sh 11762 1726853294.46046: Set connection var ansible_pipelining to False 11762 1726853294.46052: Set connection var ansible_connection to ssh 11762 1726853294.46070: variable 'ansible_shell_executable' from source: unknown 11762 1726853294.46076: variable 'ansible_connection' from source: unknown 11762 1726853294.46080: variable 'ansible_module_compression' from source: unknown 11762 1726853294.46082: variable 'ansible_shell_type' from source: unknown 11762 1726853294.46086: variable 'ansible_shell_executable' from source: unknown 11762 
1726853294.46089: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853294.46091: variable 'ansible_pipelining' from source: unknown 11762 1726853294.46093: variable 'ansible_timeout' from source: unknown 11762 1726853294.46106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853294.46168: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853294.46179: variable 'omit' from source: magic vars 11762 1726853294.46184: starting attempt loop 11762 1726853294.46187: running the handler 11762 1726853294.46194: variable 'ansible_facts' from source: unknown 11762 1726853294.46196: variable 'ansible_facts' from source: unknown 11762 1726853294.46227: _low_level_execute_command(): starting 11762 1726853294.46233: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853294.46731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.46735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.46738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.46740: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.46801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853294.46804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853294.46806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853294.46892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853294.48625: stdout chunk (state=3): >>>/root <<< 11762 1726853294.48724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853294.48755: stderr chunk (state=3): >>><<< 11762 1726853294.48759: stdout chunk (state=3): >>><<< 11762 1726853294.48779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853294.48793: _low_level_execute_command(): starting 11762 1726853294.48799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377 `" && echo ansible-tmp-1726853294.4877965-13964-217327930702377="` echo /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377 `" ) && sleep 0' 11762 1726853294.49264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853294.49267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.49269: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.49274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.49319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853294.49322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 11762 1726853294.49327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853294.49395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853294.51339: stdout chunk (state=3): >>>ansible-tmp-1726853294.4877965-13964-217327930702377=/root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377 <<< 11762 1726853294.51449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853294.51479: stderr chunk (state=3): >>><<< 11762 1726853294.51483: stdout chunk (state=3): >>><<< 11762 1726853294.51497: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853294.4877965-13964-217327930702377=/root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 
1726853294.51528: variable 'ansible_module_compression' from source: unknown 11762 1726853294.51574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11762 1726853294.51608: variable 'ansible_facts' from source: unknown 11762 1726853294.51693: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/AnsiballZ_dnf.py 11762 1726853294.51798: Sending initial data 11762 1726853294.51801: Sent initial data (152 bytes) 11762 1726853294.52230: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.52237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853294.52263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.52266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853294.52269: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.52274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.52329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853294.52332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
11762 1726853294.52338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853294.52410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853294.54037: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11762 1726853294.54045: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853294.54110: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853294.54177: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpbckeukw2 /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/AnsiballZ_dnf.py <<< 11762 1726853294.54181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/AnsiballZ_dnf.py" <<< 11762 1726853294.54245: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpbckeukw2" to remote "/root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/AnsiballZ_dnf.py" <<< 11762 1726853294.54248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/AnsiballZ_dnf.py" <<< 11762 1726853294.55012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853294.55054: stderr chunk (state=3): >>><<< 11762 1726853294.55058: stdout chunk (state=3): >>><<< 11762 1726853294.55102: done transferring module to remote 11762 1726853294.55111: _low_level_execute_command(): starting 11762 1726853294.55116: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/ /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/AnsiballZ_dnf.py && sleep 0' 11762 1726853294.55537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.55567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853294.55570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.55576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.55578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.55626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853294.55630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853294.55701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853294.57558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853294.57587: stderr chunk (state=3): >>><<< 11762 1726853294.57590: stdout chunk (state=3): >>><<< 11762 1726853294.57604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853294.57607: _low_level_execute_command(): starting 11762 1726853294.57612: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/AnsiballZ_dnf.py && sleep 0' 11762 1726853294.58028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.58057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.58060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853294.58062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853294.58116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 11762 1726853294.58124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853294.58126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853294.58196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853295.00301: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11762 1726853295.04878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853295.04882: stdout chunk (state=3): >>><<< 11762 1726853295.04885: stderr chunk (state=3): >>><<< 11762 1726853295.04888: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853295.04895: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853295.04898: _low_level_execute_command(): starting 11762 1726853295.04900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853294.4877965-13964-217327930702377/ > /dev/null 2>&1 && sleep 0' 11762 1726853295.05714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853295.05784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853295.05790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853295.05814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853295.05913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853295.07860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853295.07877: stdout chunk (state=3): >>><<< 11762 1726853295.07904: stderr chunk (state=3): >>><<< 11762 1726853295.07924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853295.07937: handler run complete 11762 1726853295.07985: attempt loop complete, returning result 11762 1726853295.08009: _execute() done 11762 1726853295.08030: dumping result to json 11762 1726853295.08105: done dumping result, returning 11762 1726853295.08152: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [02083763-bbaf-d845-03d0-000000000976] 11762 1726853295.08195: sending task result for task 02083763-bbaf-d845-03d0-000000000976 ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11762 1726853295.08686: no more pending results, returning what we have 11762 1726853295.08695: results queue empty 11762 1726853295.08696: checking for any_errors_fatal 11762 1726853295.08729: done checking for any_errors_fatal 11762 1726853295.08730: checking for max_fail_percentage 11762 1726853295.08733: done checking for max_fail_percentage 11762 1726853295.08734: checking to see if all hosts have failed and the running result is not ok 11762 1726853295.08734: done checking to see if all hosts have failed 11762 1726853295.08735: getting the remaining hosts for this loop 11762 1726853295.08737: done getting the remaining hosts for this loop 11762 1726853295.08741: getting the next task for host managed_node2 11762 1726853295.08779: done getting next task for host managed_node2 11762 1726853295.08786: ^ task is: TASK: Create test interfaces 11762 1726853295.08884: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853295.08891: getting variables 11762 1726853295.08893: in VariableManager get_vars() 11762 1726853295.08954: Calling all_inventory to load vars for managed_node2 11762 1726853295.08957: Calling groups_inventory to load vars for managed_node2 11762 1726853295.08960: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853295.08966: done sending task result for task 02083763-bbaf-d845-03d0-000000000976 11762 1726853295.08970: WORKER PROCESS EXITING 11762 1726853295.08981: Calling all_plugins_play to load vars for managed_node2 11762 1726853295.08985: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853295.08988: Calling groups_plugins_play to load vars for managed_node2 11762 1726853295.11665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853295.14827: done with get_vars() 11762 1726853295.14855: done getting variables 11762 1726853295.15057: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 
Friday 20 September 2024 13:28:15 -0400 (0:00:00.736) 0:00:45.582 ****** 11762 1726853295.15214: entering _queue_task() for managed_node2/shell 11762 1726853295.15640: worker is 1 (out of 1 available) 11762 1726853295.15653: exiting _queue_task() for managed_node2/shell 11762 1726853295.15664: done queuing things up, now waiting for results queue to drain 11762 1726853295.15666: waiting for pending results... 11762 1726853295.16350: running TaskExecutor() for managed_node2/TASK: Create test interfaces 11762 1726853295.16669: in run() - task 02083763-bbaf-d845-03d0-000000000977 11762 1726853295.16893: variable 'ansible_search_path' from source: unknown 11762 1726853295.16896: variable 'ansible_search_path' from source: unknown 11762 1726853295.16985: calling self._execute() 11762 1726853295.17030: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853295.17036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853295.17048: variable 'omit' from source: magic vars 11762 1726853295.19084: variable 'ansible_distribution_major_version' from source: facts 11762 1726853295.19094: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853295.19101: variable 'omit' from source: magic vars 11762 1726853295.19176: variable 'omit' from source: magic vars 11762 1726853295.20063: variable 'dhcp_interface1' from source: play vars 11762 1726853295.20067: variable 'dhcp_interface2' from source: play vars 11762 1726853295.20070: variable 'omit' from source: magic vars 11762 1726853295.20279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853295.20283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853295.20286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853295.20390: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853295.20404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853295.20494: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853295.20499: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853295.20502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853295.20711: Set connection var ansible_timeout to 10 11762 1726853295.20715: Set connection var ansible_shell_type to sh 11762 1726853295.20718: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853295.20720: Set connection var ansible_shell_executable to /bin/sh 11762 1726853295.20818: Set connection var ansible_pipelining to False 11762 1726853295.20822: Set connection var ansible_connection to ssh 11762 1726853295.20825: variable 'ansible_shell_executable' from source: unknown 11762 1726853295.20827: variable 'ansible_connection' from source: unknown 11762 1726853295.20829: variable 'ansible_module_compression' from source: unknown 11762 1726853295.20831: variable 'ansible_shell_type' from source: unknown 11762 1726853295.20834: variable 'ansible_shell_executable' from source: unknown 11762 1726853295.20836: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853295.20838: variable 'ansible_pipelining' from source: unknown 11762 1726853295.20845: variable 'ansible_timeout' from source: unknown 11762 1726853295.20848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853295.21176: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853295.21188: variable 'omit' from source: magic vars 11762 1726853295.21194: starting attempt loop 11762 1726853295.21196: running the handler 11762 1726853295.21228: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853295.21231: _low_level_execute_command(): starting 11762 1726853295.21451: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853295.22882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853295.23092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
11762 1726853295.23303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853295.25025: stdout chunk (state=3): >>>/root <<< 11762 1726853295.25172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853295.25176: stdout chunk (state=3): >>><<< 11762 1726853295.25213: stderr chunk (state=3): >>><<< 11762 1726853295.25216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853295.25220: _low_level_execute_command(): starting 11762 1726853295.25223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998 `" && echo ansible-tmp-1726853295.2520297-13989-229042778824998="` echo 
/root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998 `" ) && sleep 0' 11762 1726853295.26022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853295.26026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853295.26029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853295.26040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853295.26045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853295.26049: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853295.26051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853295.26054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853295.26073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853295.26085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853295.26093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853295.26220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853295.28189: stdout chunk (state=3): 
>>>ansible-tmp-1726853295.2520297-13989-229042778824998=/root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998 <<< 11762 1726853295.28294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853295.28327: stderr chunk (state=3): >>><<< 11762 1726853295.28330: stdout chunk (state=3): >>><<< 11762 1726853295.28341: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853295.2520297-13989-229042778824998=/root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853295.28368: variable 'ansible_module_compression' from source: unknown 11762 1726853295.28413: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853295.28448: variable 'ansible_facts' 
from source: unknown 11762 1726853295.28504: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/AnsiballZ_command.py 11762 1726853295.28605: Sending initial data 11762 1726853295.28609: Sent initial data (156 bytes) 11762 1726853295.29219: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853295.29223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853295.29238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853295.29334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853295.31059: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853295.31099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853295.31224: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp_8nd1dhj /root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/AnsiballZ_command.py <<< 11762 1726853295.31227: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/AnsiballZ_command.py" <<< 11762 1726853295.31286: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp_8nd1dhj" to remote "/root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/AnsiballZ_command.py" <<< 11762 1726853295.32437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853295.32523: stderr chunk (state=3): >>><<< 11762 1726853295.32578: stdout chunk (state=3): >>><<< 11762 1726853295.32589: done transferring module to remote 11762 1726853295.32605: _low_level_execute_command(): starting 11762 1726853295.32614: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/ /root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/AnsiballZ_command.py && sleep 0' 11762 1726853295.33616: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853295.33794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853295.33812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853295.33834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853295.33934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853295.35866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853295.35922: stdout chunk (state=3): >>><<< 11762 1726853295.35934: stderr chunk (state=3): >>><<< 11762 1726853295.36078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853295.36081: _low_level_execute_command(): starting 11762 1726853295.36086: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/AnsiballZ_command.py && sleep 0' 11762 1726853295.36982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853295.36989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853295.37011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853295.37016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853295.37056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853295.37060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853295.37062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853295.37123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853295.37129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853295.37146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853295.37252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853296.75406: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 
--dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:28:15.526974", "end": "2024-09-20 13:28:16.751808", "delta": "0:00:01.224834", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853296.77186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853296.77190: stdout chunk (state=3): >>><<< 11762 1726853296.77192: stderr chunk (state=3): >>><<< 11762 1726853296.77195: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6954 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:28:15.526974", "end": "2024-09-20 13:28:16.751808", "delta": "0:00:01.224834", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853296.77204: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853296.77206: _low_level_execute_command(): starting 11762 1726853296.77208: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853295.2520297-13989-229042778824998/ > /dev/null 2>&1 && sleep 0' 11762 1726853296.78138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853296.78156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853296.78170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853296.78274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853296.78491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853296.78574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853296.80566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853296.80585: stdout chunk (state=3): >>><<< 11762 1726853296.80606: stderr chunk (state=3): >>><<< 11762 1726853296.80629: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853296.80644: handler run complete 11762 1726853296.80686: Evaluated conditional (False): False 11762 1726853296.80713: attempt loop complete, returning result 11762 1726853296.80720: _execute() done 11762 1726853296.80727: dumping result to json 11762 1726853296.80738: done dumping result, returning 11762 1726853296.80754: done running TaskExecutor() for managed_node2/TASK: Create test interfaces [02083763-bbaf-d845-03d0-000000000977] 11762 1726853296.80765: sending task result for task 02083763-bbaf-d845-03d0-000000000977 11762 1726853296.81066: done sending task result for task 02083763-bbaf-d845-03d0-000000000977 11762 1726853296.81070: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.224834", "end": "2024-09-20 13:28:16.751808", "rc": 0, "start": "2024-09-20 13:28:15.526974" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6954 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6954 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11762 1726853296.81354: no more pending results, returning what we have 11762 1726853296.81359: results queue empty 11762 1726853296.81360: checking for any_errors_fatal 11762 1726853296.81370: done checking for any_errors_fatal 11762 1726853296.81373: checking for max_fail_percentage 11762 1726853296.81376: done checking for max_fail_percentage 11762 1726853296.81377: checking to see if all hosts have failed 
and the running result is not ok 11762 1726853296.81378: done checking to see if all hosts have failed 11762 1726853296.81378: getting the remaining hosts for this loop 11762 1726853296.81380: done getting the remaining hosts for this loop 11762 1726853296.81384: getting the next task for host managed_node2 11762 1726853296.81396: done getting next task for host managed_node2 11762 1726853296.81398: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11762 1726853296.81403: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853296.81407: getting variables 11762 1726853296.81409: in VariableManager get_vars() 11762 1726853296.81801: Calling all_inventory to load vars for managed_node2 11762 1726853296.81804: Calling groups_inventory to load vars for managed_node2 11762 1726853296.81807: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853296.81817: Calling all_plugins_play to load vars for managed_node2 11762 1726853296.81820: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853296.81823: Calling groups_plugins_play to load vars for managed_node2 11762 1726853296.84285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853296.93175: done with get_vars() 11762 1726853296.93213: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:28:16 -0400 (0:00:01.780) 0:00:47.363 ****** 11762 1726853296.93311: entering _queue_task() for managed_node2/include_tasks 11762 1726853296.93695: worker is 1 (out of 1 available) 11762 1726853296.93709: exiting _queue_task() for managed_node2/include_tasks 11762 1726853296.93721: done queuing things up, now waiting for results queue to drain 11762 1726853296.93723: waiting for pending results... 
11762 1726853296.94198: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11762 1726853296.94216: in run() - task 02083763-bbaf-d845-03d0-00000000097e 11762 1726853296.94237: variable 'ansible_search_path' from source: unknown 11762 1726853296.94246: variable 'ansible_search_path' from source: unknown 11762 1726853296.94306: calling self._execute() 11762 1726853296.94419: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853296.94433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853296.94446: variable 'omit' from source: magic vars 11762 1726853296.95162: variable 'ansible_distribution_major_version' from source: facts 11762 1726853296.95166: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853296.95169: _execute() done 11762 1726853296.95274: dumping result to json 11762 1726853296.95279: done dumping result, returning 11762 1726853296.95281: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-d845-03d0-00000000097e] 11762 1726853296.95283: sending task result for task 02083763-bbaf-d845-03d0-00000000097e 11762 1726853296.95356: done sending task result for task 02083763-bbaf-d845-03d0-00000000097e 11762 1726853296.95359: WORKER PROCESS EXITING 11762 1726853296.95403: no more pending results, returning what we have 11762 1726853296.95408: in VariableManager get_vars() 11762 1726853296.95456: Calling all_inventory to load vars for managed_node2 11762 1726853296.95459: Calling groups_inventory to load vars for managed_node2 11762 1726853296.95461: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853296.95476: Calling all_plugins_play to load vars for managed_node2 11762 1726853296.95479: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853296.95543: Calling groups_plugins_play to load vars for managed_node2 11762 
1726853296.97236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853296.99845: done with get_vars() 11762 1726853296.99875: variable 'ansible_search_path' from source: unknown 11762 1726853296.99877: variable 'ansible_search_path' from source: unknown 11762 1726853296.99915: we have included files to process 11762 1726853296.99916: generating all_blocks data 11762 1726853296.99920: done generating all_blocks data 11762 1726853296.99926: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853296.99927: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853296.99930: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853297.00135: done processing included file 11762 1726853297.00137: iterating over new_blocks loaded from include file 11762 1726853297.00139: in VariableManager get_vars() 11762 1726853297.00168: done with get_vars() 11762 1726853297.00170: filtering new block on tags 11762 1726853297.00205: done filtering new block on tags 11762 1726853297.00208: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11762 1726853297.00213: extending task lists for all hosts with included blocks 11762 1726853297.00438: done extending task lists 11762 1726853297.00440: done processing included files 11762 1726853297.00441: results queue empty 11762 1726853297.00441: checking for any_errors_fatal 11762 1726853297.00450: done checking for any_errors_fatal 11762 1726853297.00451: checking for max_fail_percentage 11762 1726853297.00452: done checking for 
max_fail_percentage 11762 1726853297.00453: checking to see if all hosts have failed and the running result is not ok 11762 1726853297.00453: done checking to see if all hosts have failed 11762 1726853297.00454: getting the remaining hosts for this loop 11762 1726853297.00455: done getting the remaining hosts for this loop 11762 1726853297.00458: getting the next task for host managed_node2 11762 1726853297.00462: done getting next task for host managed_node2 11762 1726853297.00464: ^ task is: TASK: Get stat for interface {{ interface }} 11762 1726853297.00467: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853297.00469: getting variables 11762 1726853297.00470: in VariableManager get_vars() 11762 1726853297.00482: Calling all_inventory to load vars for managed_node2 11762 1726853297.00484: Calling groups_inventory to load vars for managed_node2 11762 1726853297.00486: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853297.00497: Calling all_plugins_play to load vars for managed_node2 11762 1726853297.00499: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853297.00502: Calling groups_plugins_play to load vars for managed_node2 11762 1726853297.03477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853297.07721: done with get_vars() 11762 1726853297.07754: done getting variables 11762 1726853297.08141: variable 'interface' from source: task vars 11762 1726853297.08148: variable 'dhcp_interface1' from source: play vars 11762 1726853297.08233: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:28:17 -0400 (0:00:00.149) 0:00:47.513 ****** 11762 1726853297.08273: entering _queue_task() for managed_node2/stat 11762 1726853297.08778: worker is 1 (out of 1 available) 11762 1726853297.08790: exiting _queue_task() for managed_node2/stat 11762 1726853297.08802: done queuing things up, now waiting for results queue to drain 11762 1726853297.08804: waiting for pending results... 
11762 1726853297.09094: running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 11762 1726853297.09231: in run() - task 02083763-bbaf-d845-03d0-0000000009dd 11762 1726853297.09255: variable 'ansible_search_path' from source: unknown 11762 1726853297.09295: variable 'ansible_search_path' from source: unknown 11762 1726853297.09318: calling self._execute() 11762 1726853297.09428: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.09439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.09456: variable 'omit' from source: magic vars 11762 1726853297.09951: variable 'ansible_distribution_major_version' from source: facts 11762 1726853297.09956: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853297.09959: variable 'omit' from source: magic vars 11762 1726853297.09991: variable 'omit' from source: magic vars 11762 1726853297.10130: variable 'interface' from source: task vars 11762 1726853297.10149: variable 'dhcp_interface1' from source: play vars 11762 1726853297.10226: variable 'dhcp_interface1' from source: play vars 11762 1726853297.10278: variable 'omit' from source: magic vars 11762 1726853297.10612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853297.10616: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853297.10620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853297.10720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853297.10729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853297.10733: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11762 1726853297.10735: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.10738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.10976: Set connection var ansible_timeout to 10 11762 1726853297.11084: Set connection var ansible_shell_type to sh 11762 1726853297.11176: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853297.11180: Set connection var ansible_shell_executable to /bin/sh 11762 1726853297.11183: Set connection var ansible_pipelining to False 11762 1726853297.11187: Set connection var ansible_connection to ssh 11762 1726853297.11190: variable 'ansible_shell_executable' from source: unknown 11762 1726853297.11193: variable 'ansible_connection' from source: unknown 11762 1726853297.11195: variable 'ansible_module_compression' from source: unknown 11762 1726853297.11197: variable 'ansible_shell_type' from source: unknown 11762 1726853297.11199: variable 'ansible_shell_executable' from source: unknown 11762 1726853297.11201: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.11204: variable 'ansible_pipelining' from source: unknown 11762 1726853297.11207: variable 'ansible_timeout' from source: unknown 11762 1726853297.11209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.11688: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853297.11730: variable 'omit' from source: magic vars 11762 1726853297.11770: starting attempt loop 11762 1726853297.11780: running the handler 11762 1726853297.11800: _low_level_execute_command(): starting 11762 1726853297.11837: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 
1726853297.12931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.13002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.13051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.13137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.14861: stdout chunk (state=3): >>>/root <<< 11762 1726853297.15022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.15036: stderr chunk (state=3): >>><<< 11762 1726853297.15062: stdout chunk (state=3): >>><<< 11762 1726853297.15239: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853297.15246: _low_level_execute_command(): starting 11762 1726853297.15249: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016 `" && echo ansible-tmp-1726853297.151289-14066-92407109940016="` echo /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016 `" ) && sleep 0' 11762 1726853297.16534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853297.16704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853297.16722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853297.16747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853297.16783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853297.16983: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.17110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.17311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.19248: stdout chunk (state=3): >>>ansible-tmp-1726853297.151289-14066-92407109940016=/root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016 <<< 11762 1726853297.19393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.19403: stdout chunk (state=3): >>><<< 11762 1726853297.19439: stderr chunk (state=3): >>><<< 11762 1726853297.19482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853297.151289-14066-92407109940016=/root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853297.19710: variable 'ansible_module_compression' from source: unknown 11762 1726853297.19713: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853297.19714: variable 'ansible_facts' from source: unknown 11762 1726853297.19787: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/AnsiballZ_stat.py 11762 1726853297.19949: Sending initial data 11762 1726853297.19983: Sent initial data (151 bytes) 11762 1726853297.20701: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853297.20812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.20832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853297.20855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.20889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.21028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.22705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853297.22782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853297.22852: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp98hjy7lx /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/AnsiballZ_stat.py <<< 11762 1726853297.22864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/AnsiballZ_stat.py" <<< 11762 1726853297.22929: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp98hjy7lx" to remote "/root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/AnsiballZ_stat.py" <<< 11762 1726853297.24188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.24268: stderr chunk (state=3): >>><<< 11762 1726853297.24287: stdout chunk (state=3): >>><<< 11762 1726853297.24352: done transferring module to remote 11762 1726853297.24384: _low_level_execute_command(): starting 11762 1726853297.24514: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/ /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/AnsiballZ_stat.py && sleep 0' 11762 1726853297.25213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853297.25226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.25241: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853297.25311: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.25359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.25468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.27461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.27465: stdout chunk (state=3): >>><<< 11762 1726853297.27467: stderr chunk (state=3): >>><<< 11762 1726853297.27485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853297.27500: _low_level_execute_command(): starting 11762 1726853297.27585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/AnsiballZ_stat.py && sleep 0' 11762 1726853297.28027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853297.28036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853297.28064: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.28068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853297.28078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.28147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.28189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 11762 1726853297.28290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.44425: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27781, "dev": 23, "nlink": 1, "atime": 1726853295.5338821, "mtime": 1726853295.5338821, "ctime": 1726853295.5338821, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853297.45814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853297.45841: stderr chunk (state=3): >>><<< 11762 1726853297.45844: stdout chunk (state=3): >>><<< 11762 1726853297.45866: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27781, "dev": 23, "nlink": 1, "atime": 1726853295.5338821, "mtime": 1726853295.5338821, "ctime": 1726853295.5338821, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853297.45911: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853297.45920: _low_level_execute_command(): starting 11762 1726853297.45925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853297.151289-14066-92407109940016/ > /dev/null 2>&1 && sleep 0' 11762 1726853297.46362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853297.46374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853297.46399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853297.46402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853297.46404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.46456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853297.46460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.46469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.46545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.48403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.48427: stderr chunk (state=3): >>><<< 11762 1726853297.48430: stdout chunk (state=3): >>><<< 11762 1726853297.48445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853297.48454: handler run complete 11762 1726853297.48487: attempt loop complete, returning result 11762 1726853297.48490: _execute() done 11762 1726853297.48493: dumping result to json 11762 1726853297.48497: done dumping result, returning 11762 1726853297.48504: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 [02083763-bbaf-d845-03d0-0000000009dd] 11762 1726853297.48509: sending task result for task 02083763-bbaf-d845-03d0-0000000009dd 11762 1726853297.48615: done sending task result for task 02083763-bbaf-d845-03d0-0000000009dd 11762 1726853297.48618: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853295.5338821, "block_size": 4096, "blocks": 0, "ctime": 1726853295.5338821, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27781, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726853295.5338821, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11762 1726853297.48705: no more pending results, returning what we have 11762 
1726853297.48708: results queue empty 11762 1726853297.48709: checking for any_errors_fatal 11762 1726853297.48710: done checking for any_errors_fatal 11762 1726853297.48711: checking for max_fail_percentage 11762 1726853297.48713: done checking for max_fail_percentage 11762 1726853297.48714: checking to see if all hosts have failed and the running result is not ok 11762 1726853297.48714: done checking to see if all hosts have failed 11762 1726853297.48715: getting the remaining hosts for this loop 11762 1726853297.48717: done getting the remaining hosts for this loop 11762 1726853297.48720: getting the next task for host managed_node2 11762 1726853297.48729: done getting next task for host managed_node2 11762 1726853297.48731: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11762 1726853297.48735: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853297.48739: getting variables 11762 1726853297.48740: in VariableManager get_vars() 11762 1726853297.48778: Calling all_inventory to load vars for managed_node2 11762 1726853297.48781: Calling groups_inventory to load vars for managed_node2 11762 1726853297.48783: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853297.48791: Calling all_plugins_play to load vars for managed_node2 11762 1726853297.48794: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853297.48796: Calling groups_plugins_play to load vars for managed_node2 11762 1726853297.49614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853297.50487: done with get_vars() 11762 1726853297.50505: done getting variables 11762 1726853297.50550: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853297.50652: variable 'interface' from source: task vars 11762 1726853297.50655: variable 'dhcp_interface1' from source: play vars 11762 1726853297.50696: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:28:17 -0400 (0:00:00.424) 0:00:47.937 ****** 11762 1726853297.50726: entering _queue_task() for managed_node2/assert 11762 1726853297.50958: worker is 1 (out of 1 available) 11762 1726853297.50974: exiting _queue_task() for managed_node2/assert 11762 1726853297.50987: done queuing things up, now waiting for results queue to drain 11762 1726853297.50989: waiting for pending results... 
11762 1726853297.51174: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' 11762 1726853297.51257: in run() - task 02083763-bbaf-d845-03d0-00000000097f 11762 1726853297.51269: variable 'ansible_search_path' from source: unknown 11762 1726853297.51274: variable 'ansible_search_path' from source: unknown 11762 1726853297.51304: calling self._execute() 11762 1726853297.51379: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.51384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.51393: variable 'omit' from source: magic vars 11762 1726853297.51661: variable 'ansible_distribution_major_version' from source: facts 11762 1726853297.51673: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853297.51678: variable 'omit' from source: magic vars 11762 1726853297.51720: variable 'omit' from source: magic vars 11762 1726853297.51788: variable 'interface' from source: task vars 11762 1726853297.51792: variable 'dhcp_interface1' from source: play vars 11762 1726853297.51836: variable 'dhcp_interface1' from source: play vars 11762 1726853297.51851: variable 'omit' from source: magic vars 11762 1726853297.51886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853297.51914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853297.51930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853297.51946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853297.51953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853297.51980: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 11762 1726853297.51984: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.51986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.52052: Set connection var ansible_timeout to 10 11762 1726853297.52055: Set connection var ansible_shell_type to sh 11762 1726853297.52059: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853297.52064: Set connection var ansible_shell_executable to /bin/sh 11762 1726853297.52072: Set connection var ansible_pipelining to False 11762 1726853297.52079: Set connection var ansible_connection to ssh 11762 1726853297.52099: variable 'ansible_shell_executable' from source: unknown 11762 1726853297.52102: variable 'ansible_connection' from source: unknown 11762 1726853297.52104: variable 'ansible_module_compression' from source: unknown 11762 1726853297.52107: variable 'ansible_shell_type' from source: unknown 11762 1726853297.52109: variable 'ansible_shell_executable' from source: unknown 11762 1726853297.52111: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.52118: variable 'ansible_pipelining' from source: unknown 11762 1726853297.52121: variable 'ansible_timeout' from source: unknown 11762 1726853297.52124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.52229: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853297.52235: variable 'omit' from source: magic vars 11762 1726853297.52240: starting attempt loop 11762 1726853297.52245: running the handler 11762 1726853297.52336: variable 'interface_stat' from source: set_fact 11762 1726853297.52349: Evaluated conditional 
(interface_stat.stat.exists): True 11762 1726853297.52354: handler run complete 11762 1726853297.52365: attempt loop complete, returning result 11762 1726853297.52368: _execute() done 11762 1726853297.52372: dumping result to json 11762 1726853297.52375: done dumping result, returning 11762 1726853297.52381: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' [02083763-bbaf-d845-03d0-00000000097f] 11762 1726853297.52386: sending task result for task 02083763-bbaf-d845-03d0-00000000097f 11762 1726853297.52468: done sending task result for task 02083763-bbaf-d845-03d0-00000000097f 11762 1726853297.52470: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853297.52515: no more pending results, returning what we have 11762 1726853297.52519: results queue empty 11762 1726853297.52520: checking for any_errors_fatal 11762 1726853297.52529: done checking for any_errors_fatal 11762 1726853297.52529: checking for max_fail_percentage 11762 1726853297.52531: done checking for max_fail_percentage 11762 1726853297.52532: checking to see if all hosts have failed and the running result is not ok 11762 1726853297.52533: done checking to see if all hosts have failed 11762 1726853297.52533: getting the remaining hosts for this loop 11762 1726853297.52535: done getting the remaining hosts for this loop 11762 1726853297.52538: getting the next task for host managed_node2 11762 1726853297.52546: done getting next task for host managed_node2 11762 1726853297.52548: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11762 1726853297.52553: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853297.52556: getting variables 11762 1726853297.52558: in VariableManager get_vars() 11762 1726853297.52595: Calling all_inventory to load vars for managed_node2 11762 1726853297.52597: Calling groups_inventory to load vars for managed_node2 11762 1726853297.52599: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853297.52608: Calling all_plugins_play to load vars for managed_node2 11762 1726853297.52610: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853297.52613: Calling groups_plugins_play to load vars for managed_node2 11762 1726853297.54168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853297.55757: done with get_vars() 11762 1726853297.55783: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 
September 2024 13:28:17 -0400 (0:00:00.051) 0:00:47.989 ****** 11762 1726853297.55901: entering _queue_task() for managed_node2/include_tasks 11762 1726853297.56263: worker is 1 (out of 1 available) 11762 1726853297.56289: exiting _queue_task() for managed_node2/include_tasks 11762 1726853297.56302: done queuing things up, now waiting for results queue to drain 11762 1726853297.56304: waiting for pending results... 11762 1726853297.56508: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11762 1726853297.56591: in run() - task 02083763-bbaf-d845-03d0-000000000983 11762 1726853297.56607: variable 'ansible_search_path' from source: unknown 11762 1726853297.56610: variable 'ansible_search_path' from source: unknown 11762 1726853297.56649: calling self._execute() 11762 1726853297.56731: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.56735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.56744: variable 'omit' from source: magic vars 11762 1726853297.57024: variable 'ansible_distribution_major_version' from source: facts 11762 1726853297.57035: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853297.57039: _execute() done 11762 1726853297.57042: dumping result to json 11762 1726853297.57054: done dumping result, returning 11762 1726853297.57057: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-d845-03d0-000000000983] 11762 1726853297.57060: sending task result for task 02083763-bbaf-d845-03d0-000000000983 11762 1726853297.57148: done sending task result for task 02083763-bbaf-d845-03d0-000000000983 11762 1726853297.57150: WORKER PROCESS EXITING 11762 1726853297.57195: no more pending results, returning what we have 11762 1726853297.57200: in VariableManager get_vars() 11762 1726853297.57241: Calling all_inventory to load vars for managed_node2 11762 
1726853297.57245: Calling groups_inventory to load vars for managed_node2 11762 1726853297.57247: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853297.57258: Calling all_plugins_play to load vars for managed_node2 11762 1726853297.57269: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853297.57275: Calling groups_plugins_play to load vars for managed_node2 11762 1726853297.58034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853297.58994: done with get_vars() 11762 1726853297.59006: variable 'ansible_search_path' from source: unknown 11762 1726853297.59007: variable 'ansible_search_path' from source: unknown 11762 1726853297.59039: we have included files to process 11762 1726853297.59041: generating all_blocks data 11762 1726853297.59046: done generating all_blocks data 11762 1726853297.59049: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853297.59050: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853297.59052: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11762 1726853297.59222: done processing included file 11762 1726853297.59224: iterating over new_blocks loaded from include file 11762 1726853297.59225: in VariableManager get_vars() 11762 1726853297.59246: done with get_vars() 11762 1726853297.59248: filtering new block on tags 11762 1726853297.59278: done filtering new block on tags 11762 1726853297.59281: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11762 1726853297.59287: 
extending task lists for all hosts with included blocks 11762 1726853297.59495: done extending task lists 11762 1726853297.59496: done processing included files 11762 1726853297.59497: results queue empty 11762 1726853297.59498: checking for any_errors_fatal 11762 1726853297.59501: done checking for any_errors_fatal 11762 1726853297.59502: checking for max_fail_percentage 11762 1726853297.59503: done checking for max_fail_percentage 11762 1726853297.59504: checking to see if all hosts have failed and the running result is not ok 11762 1726853297.59505: done checking to see if all hosts have failed 11762 1726853297.59505: getting the remaining hosts for this loop 11762 1726853297.59506: done getting the remaining hosts for this loop 11762 1726853297.59508: getting the next task for host managed_node2 11762 1726853297.59513: done getting next task for host managed_node2 11762 1726853297.59515: ^ task is: TASK: Get stat for interface {{ interface }} 11762 1726853297.59518: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853297.59520: getting variables 11762 1726853297.59521: in VariableManager get_vars() 11762 1726853297.59531: Calling all_inventory to load vars for managed_node2 11762 1726853297.59533: Calling groups_inventory to load vars for managed_node2 11762 1726853297.59535: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853297.59540: Calling all_plugins_play to load vars for managed_node2 11762 1726853297.59544: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853297.59547: Calling groups_plugins_play to load vars for managed_node2 11762 1726853297.60453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853297.61360: done with get_vars() 11762 1726853297.61376: done getting variables 11762 1726853297.61492: variable 'interface' from source: task vars 11762 1726853297.61496: variable 'dhcp_interface2' from source: play vars 11762 1726853297.61535: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:28:17 -0400 (0:00:00.056) 0:00:48.046 ****** 11762 1726853297.61564: entering _queue_task() for managed_node2/stat 11762 1726853297.61809: worker is 1 (out of 1 available) 11762 1726853297.61825: exiting _queue_task() for managed_node2/stat 11762 1726853297.61839: done queuing things up, now waiting for results 
queue to drain 11762 1726853297.61841: waiting for pending results... 11762 1726853297.62024: running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 11762 1726853297.62120: in run() - task 02083763-bbaf-d845-03d0-000000000a01 11762 1726853297.62131: variable 'ansible_search_path' from source: unknown 11762 1726853297.62134: variable 'ansible_search_path' from source: unknown 11762 1726853297.62162: calling self._execute() 11762 1726853297.62236: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.62240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.62249: variable 'omit' from source: magic vars 11762 1726853297.62515: variable 'ansible_distribution_major_version' from source: facts 11762 1726853297.62525: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853297.62531: variable 'omit' from source: magic vars 11762 1726853297.62575: variable 'omit' from source: magic vars 11762 1726853297.62676: variable 'interface' from source: task vars 11762 1726853297.62876: variable 'dhcp_interface2' from source: play vars 11762 1726853297.62880: variable 'dhcp_interface2' from source: play vars 11762 1726853297.62882: variable 'omit' from source: magic vars 11762 1726853297.62885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853297.62887: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853297.62889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853297.62891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853297.62893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 
1726853297.62932: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853297.62940: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.62950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.63067: Set connection var ansible_timeout to 10 11762 1726853297.63079: Set connection var ansible_shell_type to sh 11762 1726853297.63090: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853297.63100: Set connection var ansible_shell_executable to /bin/sh 11762 1726853297.63120: Set connection var ansible_pipelining to False 11762 1726853297.63134: Set connection var ansible_connection to ssh 11762 1726853297.63162: variable 'ansible_shell_executable' from source: unknown 11762 1726853297.63172: variable 'ansible_connection' from source: unknown 11762 1726853297.63229: variable 'ansible_module_compression' from source: unknown 11762 1726853297.63232: variable 'ansible_shell_type' from source: unknown 11762 1726853297.63234: variable 'ansible_shell_executable' from source: unknown 11762 1726853297.63235: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853297.63237: variable 'ansible_pipelining' from source: unknown 11762 1726853297.63240: variable 'ansible_timeout' from source: unknown 11762 1726853297.63241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853297.63663: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853297.63668: variable 'omit' from source: magic vars 11762 1726853297.63670: starting attempt loop 11762 1726853297.63675: running the handler 11762 1726853297.63677: _low_level_execute_command(): starting 11762 1726853297.63680: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853297.65010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.65123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.65302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.65412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.67152: stdout chunk (state=3): >>>/root <<< 11762 1726853297.67290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.67314: stderr chunk (state=3): >>><<< 11762 1726853297.67327: stdout chunk (state=3): >>><<< 11762 1726853297.67365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853297.67406: _low_level_execute_command(): starting 11762 1726853297.67498: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326 `" && echo ansible-tmp-1726853297.6737502-14100-29948129443326="` echo /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326 `" ) && sleep 0' 11762 1726853297.68362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853297.68591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.68609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.68712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.70776: stdout chunk (state=3): >>>ansible-tmp-1726853297.6737502-14100-29948129443326=/root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326 <<< 11762 1726853297.70882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.70986: stderr chunk (state=3): >>><<< 11762 1726853297.70996: stdout chunk (state=3): >>><<< 11762 1726853297.71022: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853297.6737502-14100-29948129443326=/root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853297.71194: variable 'ansible_module_compression' from source: unknown 11762 1726853297.71260: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11762 1726853297.71519: variable 'ansible_facts' from source: unknown 11762 1726853297.71586: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/AnsiballZ_stat.py 11762 1726853297.72028: Sending initial data 11762 1726853297.72038: Sent initial data (152 bytes) 11762 1726853297.73304: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853297.73411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.73474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853297.73522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.73562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.73721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.75523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853297.75599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853297.75650: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpzlwzoguc /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/AnsiballZ_stat.py <<< 11762 1726853297.75654: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/AnsiballZ_stat.py" <<< 11762 1726853297.75731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11762 1726853297.75740: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpzlwzoguc" to remote "/root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/AnsiballZ_stat.py" <<< 11762 1726853297.75748: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/AnsiballZ_stat.py" <<< 11762 1726853297.76936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.77097: stderr chunk (state=3): >>><<< 11762 1726853297.77101: stdout chunk (state=3): >>><<< 11762 1726853297.77103: done transferring module to remote 11762 1726853297.77106: _low_level_execute_command(): starting 11762 1726853297.77108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/ /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/AnsiballZ_stat.py && sleep 0' 11762 1726853297.77652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853297.77668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853297.77782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.77797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.77899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.79810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853297.79869: stderr chunk (state=3): >>><<< 11762 1726853297.79881: stdout chunk (state=3): >>><<< 11762 1726853297.79906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853297.79914: _low_level_execute_command(): starting 11762 1726853297.79926: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/AnsiballZ_stat.py && sleep 0' 11762 1726853297.80566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853297.80584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853297.80597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853297.80613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853297.80636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853297.80651: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853297.80668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.80747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853297.80782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853297.80802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853297.80824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.80933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853297.96535: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28187, "dev": 23, "nlink": 1, "atime": 1726853295.5382504, "mtime": 1726853295.5382504, "ctime": 1726853295.5382504, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11762 1726853297.97910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853297.97932: stderr chunk (state=3): >>><<< 11762 1726853297.97936: stdout chunk (state=3): >>><<< 11762 1726853297.97958: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28187, "dev": 23, "nlink": 1, "atime": 1726853295.5382504, "mtime": 1726853295.5382504, "ctime": 1726853295.5382504, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853297.97998: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853297.98005: _low_level_execute_command(): starting 11762 1726853297.98010: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853297.6737502-14100-29948129443326/ > /dev/null 2>&1 && sleep 0' 11762 1726853297.98462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853297.98466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.98476: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853297.98478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853297.98481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853297.98527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853297.98533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853297.98599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853298.00504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853298.00528: stderr chunk (state=3): >>><<< 11762 1726853298.00532: stdout chunk (state=3): >>><<< 11762 1726853298.00549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853298.00555: handler run complete 11762 1726853298.00588: attempt loop complete, returning result 11762 1726853298.00591: _execute() done 11762 1726853298.00593: dumping result to json 11762 1726853298.00598: done dumping result, returning 11762 1726853298.00605: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 [02083763-bbaf-d845-03d0-000000000a01] 11762 1726853298.00610: sending task result for task 02083763-bbaf-d845-03d0-000000000a01 11762 1726853298.00718: done sending task result for task 02083763-bbaf-d845-03d0-000000000a01 11762 1726853298.00723: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853295.5382504, "block_size": 4096, "blocks": 0, "ctime": 1726853295.5382504, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28187, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726853295.5382504, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11762 1726853298.00825: no more pending results, returning what we have 11762 1726853298.00829: results 
queue empty 11762 1726853298.00830: checking for any_errors_fatal 11762 1726853298.00833: done checking for any_errors_fatal 11762 1726853298.00834: checking for max_fail_percentage 11762 1726853298.00836: done checking for max_fail_percentage 11762 1726853298.00837: checking to see if all hosts have failed and the running result is not ok 11762 1726853298.00837: done checking to see if all hosts have failed 11762 1726853298.00838: getting the remaining hosts for this loop 11762 1726853298.00840: done getting the remaining hosts for this loop 11762 1726853298.00843: getting the next task for host managed_node2 11762 1726853298.00853: done getting next task for host managed_node2 11762 1726853298.00855: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11762 1726853298.00859: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853298.00862: getting variables 11762 1726853298.00863: in VariableManager get_vars() 11762 1726853298.00899: Calling all_inventory to load vars for managed_node2 11762 1726853298.00902: Calling groups_inventory to load vars for managed_node2 11762 1726853298.00904: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.00913: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.00915: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.00918: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.02640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.05016: done with get_vars() 11762 1726853298.05041: done getting variables 11762 1726853298.05106: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853298.05223: variable 'interface' from source: task vars 11762 1726853298.05227: variable 'dhcp_interface2' from source: play vars 11762 1726853298.05290: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:28:18 -0400 (0:00:00.437) 0:00:48.483 ****** 11762 1726853298.05323: entering _queue_task() for managed_node2/assert 11762 1726853298.05900: worker is 1 (out of 1 available) 11762 1726853298.05912: exiting _queue_task() for managed_node2/assert 11762 1726853298.05922: done queuing things up, now waiting for results queue to drain 11762 1726853298.05924: waiting for pending results... 
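(Editor's note: the `stat` result above comes from a task in the test playbook. A hedged reconstruction of that task, inferred purely from the module args and the later `interface_stat.stat.exists` conditional in this log — the real task file may differ — might look like:)

```yaml
# Sketch reconstructed from the module invocation shown in the log above.
- name: Get stat for interface test2
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # resolved to /sys/class/net/test2 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # register name inferred from the later conditional
```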
11762 1726853298.06195: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' 11762 1726853298.06201: in run() - task 02083763-bbaf-d845-03d0-000000000984 11762 1726853298.06206: variable 'ansible_search_path' from source: unknown 11762 1726853298.06209: variable 'ansible_search_path' from source: unknown 11762 1726853298.06213: calling self._execute() 11762 1726853298.06304: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.06307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.06318: variable 'omit' from source: magic vars 11762 1726853298.06704: variable 'ansible_distribution_major_version' from source: facts 11762 1726853298.06724: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853298.06728: variable 'omit' from source: magic vars 11762 1726853298.06889: variable 'omit' from source: magic vars 11762 1726853298.06893: variable 'interface' from source: task vars 11762 1726853298.06895: variable 'dhcp_interface2' from source: play vars 11762 1726853298.07156: variable 'dhcp_interface2' from source: play vars 11762 1726853298.07159: variable 'omit' from source: magic vars 11762 1726853298.07162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853298.07164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853298.07167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853298.07169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853298.07173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853298.07176: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 11762 1726853298.07178: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.07180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.07259: Set connection var ansible_timeout to 10 11762 1726853298.07263: Set connection var ansible_shell_type to sh 11762 1726853298.07265: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853298.07267: Set connection var ansible_shell_executable to /bin/sh 11762 1726853298.07270: Set connection var ansible_pipelining to False 11762 1726853298.07274: Set connection var ansible_connection to ssh 11762 1726853298.07459: variable 'ansible_shell_executable' from source: unknown 11762 1726853298.07462: variable 'ansible_connection' from source: unknown 11762 1726853298.07466: variable 'ansible_module_compression' from source: unknown 11762 1726853298.07468: variable 'ansible_shell_type' from source: unknown 11762 1726853298.07472: variable 'ansible_shell_executable' from source: unknown 11762 1726853298.07474: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.07476: variable 'ansible_pipelining' from source: unknown 11762 1726853298.07479: variable 'ansible_timeout' from source: unknown 11762 1726853298.07481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.07484: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853298.07486: variable 'omit' from source: magic vars 11762 1726853298.07488: starting attempt loop 11762 1726853298.07490: running the handler 11762 1726853298.07604: variable 'interface_stat' from source: set_fact 11762 1726853298.07623: Evaluated conditional 
(interface_stat.stat.exists): True 11762 1726853298.07628: handler run complete 11762 1726853298.07643: attempt loop complete, returning result 11762 1726853298.07647: _execute() done 11762 1726853298.07651: dumping result to json 11762 1726853298.07653: done dumping result, returning 11762 1726853298.07660: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' [02083763-bbaf-d845-03d0-000000000984] 11762 1726853298.07666: sending task result for task 02083763-bbaf-d845-03d0-000000000984 11762 1726853298.07757: done sending task result for task 02083763-bbaf-d845-03d0-000000000984 11762 1726853298.07760: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11762 1726853298.07827: no more pending results, returning what we have 11762 1726853298.07831: results queue empty 11762 1726853298.07832: checking for any_errors_fatal 11762 1726853298.07845: done checking for any_errors_fatal 11762 1726853298.07846: checking for max_fail_percentage 11762 1726853298.07848: done checking for max_fail_percentage 11762 1726853298.07849: checking to see if all hosts have failed and the running result is not ok 11762 1726853298.07850: done checking to see if all hosts have failed 11762 1726853298.07851: getting the remaining hosts for this loop 11762 1726853298.07853: done getting the remaining hosts for this loop 11762 1726853298.07857: getting the next task for host managed_node2 11762 1726853298.07867: done getting next task for host managed_node2 11762 1726853298.07872: ^ task is: TASK: Test 11762 1726853298.07876: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853298.07882: getting variables 11762 1726853298.07884: in VariableManager get_vars() 11762 1726853298.07927: Calling all_inventory to load vars for managed_node2 11762 1726853298.07930: Calling groups_inventory to load vars for managed_node2 11762 1726853298.07932: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.07946: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.07954: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.07957: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.09534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.12718: done with get_vars() 11762 1726853298.12753: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:28:18 -0400 (0:00:00.075) 0:00:48.558 ****** 11762 1726853298.12850: entering _queue_task() for managed_node2/include_tasks 11762 1726853298.13599: worker is 1 (out of 1 available) 11762 1726853298.13612: exiting _queue_task() for managed_node2/include_tasks 11762 1726853298.13625: done queuing things up, now waiting for results queue to drain 11762 1726853298.13627: waiting for pending results... 
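(Editor's note: the assertion evaluated above — `(interface_stat.stat.exists): True`, reported as "All assertions passed" — corresponds to a task at `assert_device_present.yml:5`. A minimal sketch of what that task likely contains, assuming the conditional in the log is the only assertion:)

```yaml
- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists   # the conditional evaluated True in the log
```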
11762 1726853298.14179: running TaskExecutor() for managed_node2/TASK: Test 11762 1726853298.14186: in run() - task 02083763-bbaf-d845-03d0-0000000008ee 11762 1726853298.14189: variable 'ansible_search_path' from source: unknown 11762 1726853298.14192: variable 'ansible_search_path' from source: unknown 11762 1726853298.14195: variable 'lsr_test' from source: include params 11762 1726853298.14420: variable 'lsr_test' from source: include params 11762 1726853298.14424: variable 'omit' from source: magic vars 11762 1726853298.14507: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.14519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.14637: variable 'omit' from source: magic vars 11762 1726853298.14790: variable 'ansible_distribution_major_version' from source: facts 11762 1726853298.14799: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853298.14814: variable 'item' from source: unknown 11762 1726853298.14964: variable 'item' from source: unknown 11762 1726853298.14969: variable 'item' from source: unknown 11762 1726853298.14974: variable 'item' from source: unknown 11762 1726853298.15307: dumping result to json 11762 1726853298.15311: done dumping result, returning 11762 1726853298.15314: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-d845-03d0-0000000008ee] 11762 1726853298.15316: sending task result for task 02083763-bbaf-d845-03d0-0000000008ee 11762 1726853298.15593: no more pending results, returning what we have 11762 1726853298.15597: in VariableManager get_vars() 11762 1726853298.15632: Calling all_inventory to load vars for managed_node2 11762 1726853298.15634: Calling groups_inventory to load vars for managed_node2 11762 1726853298.15636: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.15647: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.15650: Calling 
groups_plugins_inventory to load vars for managed_node2 11762 1726853298.15653: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.16401: done sending task result for task 02083763-bbaf-d845-03d0-0000000008ee 11762 1726853298.16405: WORKER PROCESS EXITING 11762 1726853298.18746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.21745: done with get_vars() 11762 1726853298.21776: variable 'ansible_search_path' from source: unknown 11762 1726853298.21778: variable 'ansible_search_path' from source: unknown 11762 1726853298.21823: we have included files to process 11762 1726853298.21824: generating all_blocks data 11762 1726853298.21826: done generating all_blocks data 11762 1726853298.21831: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11762 1726853298.21832: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11762 1726853298.21835: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11762 1726853298.22105: in VariableManager get_vars() 11762 1726853298.22130: done with get_vars() 11762 1726853298.22135: variable 'omit' from source: magic vars 11762 1726853298.22187: variable 'omit' from source: magic vars 11762 1726853298.22245: in VariableManager get_vars() 11762 1726853298.22265: done with get_vars() 11762 1726853298.22292: in VariableManager get_vars() 11762 1726853298.22309: done with get_vars() 11762 1726853298.22351: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11762 1726853298.22537: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11762 1726853298.22621: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11762 1726853298.23014: in VariableManager get_vars() 11762 1726853298.23037: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11762 1726853298.25936: done processing included file 11762 1726853298.25938: iterating over new_blocks loaded from include file 11762 1726853298.25939: in VariableManager get_vars() 11762 1726853298.25977: done with get_vars() 11762 1726853298.25979: filtering new block on tags 11762 1726853298.26299: done filtering new block on tags 11762 1726853298.26303: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml for managed_node2 => (item=tasks/create_bond_profile_reconfigure.yml) 11762 1726853298.26309: extending task lists for all hosts with included blocks 11762 1726853298.27725: done extending task lists 11762 1726853298.27727: done processing included files 11762 1726853298.27728: results queue empty 11762 1726853298.27728: checking for any_errors_fatal 11762 1726853298.27732: done checking for any_errors_fatal 11762 1726853298.27733: checking for max_fail_percentage 11762 1726853298.27734: done checking for max_fail_percentage 11762 1726853298.27735: checking to see if all hosts have failed and the running result is not ok 11762 1726853298.27735: done checking to see if all hosts have failed 11762 1726853298.27736: getting the remaining hosts for this loop 11762 1726853298.27737: done getting the remaining hosts for this loop 11762 1726853298.27740: getting the next task for host managed_node2 11762 1726853298.27744: done getting next task for host managed_node2 11762 1726853298.27747: ^ task is: TASK: 
fedora.linux_system_roles.network : Ensure ansible_facts used by role 11762 1726853298.27751: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853298.27761: getting variables 11762 1726853298.27762: in VariableManager get_vars() 11762 1726853298.27785: Calling all_inventory to load vars for managed_node2 11762 1726853298.27788: Calling groups_inventory to load vars for managed_node2 11762 1726853298.27790: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.27797: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.27800: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.27802: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.29247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.31791: done with get_vars() 11762 1726853298.31827: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:28:18 -0400 (0:00:00.190) 0:00:48.749 ****** 11762 1726853298.31920: entering _queue_task() for managed_node2/include_tasks 11762 1726853298.32882: worker is 1 (out of 1 available) 11762 1726853298.32904: exiting _queue_task() for managed_node2/include_tasks 11762 1726853298.32916: done queuing things up, now waiting for results queue to drain 11762 1726853298.32917: waiting for pending results... 
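(Editor's note: the "Test" task at `run_test.yml:30` above is an `include_tasks` driven by the `lsr_test` include param, yielding `item=tasks/create_bond_profile_reconfigure.yml`. A hedged sketch, assuming `lsr_test` is a simple list of task-file paths:)

```yaml
- name: Test
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_test }}"   # produced item=tasks/create_bond_profile_reconfigure.yml here
```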
11762 1726853298.33254: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11762 1726853298.33614: in run() - task 02083763-bbaf-d845-03d0-000000000a2e 11762 1726853298.33636: variable 'ansible_search_path' from source: unknown 11762 1726853298.33645: variable 'ansible_search_path' from source: unknown 11762 1726853298.33773: calling self._execute() 11762 1726853298.33918: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.33939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.33981: variable 'omit' from source: magic vars 11762 1726853298.34363: variable 'ansible_distribution_major_version' from source: facts 11762 1726853298.34437: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853298.34440: _execute() done 11762 1726853298.34442: dumping result to json 11762 1726853298.34444: done dumping result, returning 11762 1726853298.34446: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-d845-03d0-000000000a2e] 11762 1726853298.34448: sending task result for task 02083763-bbaf-d845-03d0-000000000a2e 11762 1726853298.34777: done sending task result for task 02083763-bbaf-d845-03d0-000000000a2e 11762 1726853298.34781: WORKER PROCESS EXITING 11762 1726853298.34935: no more pending results, returning what we have 11762 1726853298.34940: in VariableManager get_vars() 11762 1726853298.35047: Calling all_inventory to load vars for managed_node2 11762 1726853298.35050: Calling groups_inventory to load vars for managed_node2 11762 1726853298.35052: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.35062: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.35066: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.35068: Calling 
groups_plugins_play to load vars for managed_node2 11762 1726853298.36463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.38197: done with get_vars() 11762 1726853298.38222: variable 'ansible_search_path' from source: unknown 11762 1726853298.38335: variable 'ansible_search_path' from source: unknown 11762 1726853298.38410: we have included files to process 11762 1726853298.38411: generating all_blocks data 11762 1726853298.38413: done generating all_blocks data 11762 1726853298.38415: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853298.38416: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853298.38419: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11762 1726853298.39637: done processing included file 11762 1726853298.39640: iterating over new_blocks loaded from include file 11762 1726853298.39641: in VariableManager get_vars() 11762 1726853298.39670: done with get_vars() 11762 1726853298.39777: filtering new block on tags 11762 1726853298.39812: done filtering new block on tags 11762 1726853298.39815: in VariableManager get_vars() 11762 1726853298.39956: done with get_vars() 11762 1726853298.39958: filtering new block on tags 11762 1726853298.40006: done filtering new block on tags 11762 1726853298.40009: in VariableManager get_vars() 11762 1726853298.40035: done with get_vars() 11762 1726853298.40037: filtering new block on tags 11762 1726853298.40191: done filtering new block on tags 11762 1726853298.40194: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11762 1726853298.40199: extending task lists for 
all hosts with included blocks 11762 1726853298.43673: done extending task lists 11762 1726853298.43775: done processing included files 11762 1726853298.43777: results queue empty 11762 1726853298.43778: checking for any_errors_fatal 11762 1726853298.43783: done checking for any_errors_fatal 11762 1726853298.43784: checking for max_fail_percentage 11762 1726853298.43785: done checking for max_fail_percentage 11762 1726853298.43786: checking to see if all hosts have failed and the running result is not ok 11762 1726853298.43787: done checking to see if all hosts have failed 11762 1726853298.43788: getting the remaining hosts for this loop 11762 1726853298.43789: done getting the remaining hosts for this loop 11762 1726853298.43792: getting the next task for host managed_node2 11762 1726853298.43802: done getting next task for host managed_node2 11762 1726853298.43804: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11762 1726853298.43809: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853298.43820: getting variables 11762 1726853298.43821: in VariableManager get_vars() 11762 1726853298.43841: Calling all_inventory to load vars for managed_node2 11762 1726853298.43843: Calling groups_inventory to load vars for managed_node2 11762 1726853298.43846: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.43852: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.43854: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.43857: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.44623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.45476: done with get_vars() 11762 1726853298.45491: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:28:18 -0400 (0:00:00.136) 0:00:48.885 ****** 11762 1726853298.45548: entering _queue_task() for managed_node2/setup 11762 1726853298.45850: worker is 1 (out of 1 available) 11762 1726853298.45863: exiting _queue_task() for managed_node2/setup 11762 1726853298.45878: done queuing things up, now waiting for results queue to drain 11762 1726853298.45880: waiting for pending results... 
11762 1726853298.46106: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11762 1726853298.46240: in run() - task 02083763-bbaf-d845-03d0-000000000b10 11762 1726853298.46257: variable 'ansible_search_path' from source: unknown 11762 1726853298.46261: variable 'ansible_search_path' from source: unknown 11762 1726853298.46304: calling self._execute() 11762 1726853298.46395: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.46399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.46412: variable 'omit' from source: magic vars 11762 1726853298.46815: variable 'ansible_distribution_major_version' from source: facts 11762 1726853298.46819: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853298.46999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853298.48809: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853298.48854: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853298.48888: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853298.48910: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853298.48929: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853298.48989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853298.49011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853298.49030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853298.49057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853298.49068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853298.49110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853298.49126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853298.49145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853298.49167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853298.49180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853298.49285: variable '__network_required_facts' from source: role 
'' defaults 11762 1726853298.49293: variable 'ansible_facts' from source: unknown 11762 1726853298.49739: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11762 1726853298.49746: when evaluation is False, skipping this task 11762 1726853298.49749: _execute() done 11762 1726853298.49751: dumping result to json 11762 1726853298.49756: done dumping result, returning 11762 1726853298.49759: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-d845-03d0-000000000b10] 11762 1726853298.49761: sending task result for task 02083763-bbaf-d845-03d0-000000000b10 11762 1726853298.49849: done sending task result for task 02083763-bbaf-d845-03d0-000000000b10 11762 1726853298.49852: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853298.49912: no more pending results, returning what we have 11762 1726853298.49916: results queue empty 11762 1726853298.49917: checking for any_errors_fatal 11762 1726853298.49918: done checking for any_errors_fatal 11762 1726853298.49919: checking for max_fail_percentage 11762 1726853298.49921: done checking for max_fail_percentage 11762 1726853298.49922: checking to see if all hosts have failed and the running result is not ok 11762 1726853298.49922: done checking to see if all hosts have failed 11762 1726853298.49923: getting the remaining hosts for this loop 11762 1726853298.49925: done getting the remaining hosts for this loop 11762 1726853298.49928: getting the next task for host managed_node2 11762 1726853298.49940: done getting next task for host managed_node2 11762 1726853298.49945: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11762 1726853298.49950: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853298.49972: getting variables 11762 1726853298.49974: in VariableManager get_vars() 11762 1726853298.50016: Calling all_inventory to load vars for managed_node2 11762 1726853298.50019: Calling groups_inventory to load vars for managed_node2 11762 1726853298.50020: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.50029: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.50032: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.50040: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.51301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.52279: done with get_vars() 11762 1726853298.52295: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:28:18 -0400 (0:00:00.068) 0:00:48.954 ****** 11762 1726853298.52369: entering _queue_task() for managed_node2/stat 11762 1726853298.52613: worker is 1 (out of 1 available) 11762 1726853298.52630: exiting _queue_task() for managed_node2/stat 11762 1726853298.52644: done queuing things up, now waiting for results queue to drain 11762 1726853298.52645: waiting for pending results... 
11762 1726853298.52833: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11762 1726853298.52931: in run() - task 02083763-bbaf-d845-03d0-000000000b12 11762 1726853298.52942: variable 'ansible_search_path' from source: unknown 11762 1726853298.52946: variable 'ansible_search_path' from source: unknown 11762 1726853298.52977: calling self._execute() 11762 1726853298.53056: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.53060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.53068: variable 'omit' from source: magic vars 11762 1726853298.53351: variable 'ansible_distribution_major_version' from source: facts 11762 1726853298.53360: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853298.53476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853298.53677: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853298.53709: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853298.53735: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853298.53763: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853298.53825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853298.53842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853298.53866: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853298.53884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853298.53944: variable '__network_is_ostree' from source: set_fact 11762 1726853298.53952: Evaluated conditional (not __network_is_ostree is defined): False 11762 1726853298.53957: when evaluation is False, skipping this task 11762 1726853298.53959: _execute() done 11762 1726853298.53961: dumping result to json 11762 1726853298.53964: done dumping result, returning 11762 1726853298.53975: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-d845-03d0-000000000b12] 11762 1726853298.53978: sending task result for task 02083763-bbaf-d845-03d0-000000000b12 11762 1726853298.54055: done sending task result for task 02083763-bbaf-d845-03d0-000000000b12 11762 1726853298.54058: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11762 1726853298.54125: no more pending results, returning what we have 11762 1726853298.54129: results queue empty 11762 1726853298.54129: checking for any_errors_fatal 11762 1726853298.54138: done checking for any_errors_fatal 11762 1726853298.54138: checking for max_fail_percentage 11762 1726853298.54140: done checking for max_fail_percentage 11762 1726853298.54141: checking to see if all hosts have failed and the running result is not ok 11762 1726853298.54142: done checking to see if all hosts have failed 11762 1726853298.54143: getting the remaining hosts for this loop 11762 1726853298.54145: done getting the remaining hosts for this loop 11762 
1726853298.54148: getting the next task for host managed_node2 11762 1726853298.54154: done getting next task for host managed_node2 11762 1726853298.54157: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11762 1726853298.54162: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853298.54190: getting variables 11762 1726853298.54192: in VariableManager get_vars() 11762 1726853298.54227: Calling all_inventory to load vars for managed_node2 11762 1726853298.54230: Calling groups_inventory to load vars for managed_node2 11762 1726853298.54231: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.54239: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.54242: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.54244: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.54983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.55849: done with get_vars() 11762 1726853298.55864: done getting variables 11762 1726853298.55906: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:28:18 -0400 (0:00:00.035) 0:00:48.989 ****** 11762 1726853298.55938: entering _queue_task() for managed_node2/set_fact 11762 1726853298.56169: worker is 1 (out of 1 available) 11762 1726853298.56184: exiting _queue_task() for managed_node2/set_fact 11762 1726853298.56198: done queuing things up, now waiting for results queue to drain 11762 1726853298.56200: waiting for pending results... 
11762 1726853298.56389: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11762 1726853298.56489: in run() - task 02083763-bbaf-d845-03d0-000000000b13 11762 1726853298.56502: variable 'ansible_search_path' from source: unknown 11762 1726853298.56507: variable 'ansible_search_path' from source: unknown 11762 1726853298.56537: calling self._execute() 11762 1726853298.56609: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.56616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.56625: variable 'omit' from source: magic vars 11762 1726853298.56899: variable 'ansible_distribution_major_version' from source: facts 11762 1726853298.56908: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853298.57023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853298.57223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853298.57256: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853298.57283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853298.57310: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853298.57374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853298.57391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853298.57412: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853298.57431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853298.57493: variable '__network_is_ostree' from source: set_fact 11762 1726853298.57500: Evaluated conditional (not __network_is_ostree is defined): False 11762 1726853298.57502: when evaluation is False, skipping this task 11762 1726853298.57506: _execute() done 11762 1726853298.57509: dumping result to json 11762 1726853298.57512: done dumping result, returning 11762 1726853298.57522: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-d845-03d0-000000000b13] 11762 1726853298.57525: sending task result for task 02083763-bbaf-d845-03d0-000000000b13 11762 1726853298.57602: done sending task result for task 02083763-bbaf-d845-03d0-000000000b13 11762 1726853298.57605: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11762 1726853298.57667: no more pending results, returning what we have 11762 1726853298.57673: results queue empty 11762 1726853298.57674: checking for any_errors_fatal 11762 1726853298.57680: done checking for any_errors_fatal 11762 1726853298.57680: checking for max_fail_percentage 11762 1726853298.57682: done checking for max_fail_percentage 11762 1726853298.57683: checking to see if all hosts have failed and the running result is not ok 11762 1726853298.57684: done checking to see if all hosts have failed 11762 1726853298.57685: getting the remaining hosts for this loop 11762 1726853298.57686: done getting the remaining hosts for this loop 
11762 1726853298.57690: getting the next task for host managed_node2 11762 1726853298.57699: done getting next task for host managed_node2 11762 1726853298.57703: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11762 1726853298.57707: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853298.57724: getting variables 11762 1726853298.57725: in VariableManager get_vars() 11762 1726853298.57761: Calling all_inventory to load vars for managed_node2 11762 1726853298.57763: Calling groups_inventory to load vars for managed_node2 11762 1726853298.57765: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853298.57774: Calling all_plugins_play to load vars for managed_node2 11762 1726853298.57777: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853298.57779: Calling groups_plugins_play to load vars for managed_node2 11762 1726853298.58655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853298.59502: done with get_vars() 11762 1726853298.59516: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:28:18 -0400 (0:00:00.036) 0:00:49.026 ****** 11762 1726853298.59586: entering _queue_task() for managed_node2/service_facts 11762 1726853298.59806: worker is 1 (out of 1 available) 11762 1726853298.59823: exiting _queue_task() for managed_node2/service_facts 11762 1726853298.59837: done queuing things up, now waiting for results queue to drain 11762 1726853298.59839: waiting for pending results... 
11762 1726853298.60022: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11762 1726853298.60128: in run() - task 02083763-bbaf-d845-03d0-000000000b15 11762 1726853298.60139: variable 'ansible_search_path' from source: unknown 11762 1726853298.60143: variable 'ansible_search_path' from source: unknown 11762 1726853298.60175: calling self._execute() 11762 1726853298.60246: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.60253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.60261: variable 'omit' from source: magic vars 11762 1726853298.60527: variable 'ansible_distribution_major_version' from source: facts 11762 1726853298.60537: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853298.60543: variable 'omit' from source: magic vars 11762 1726853298.60602: variable 'omit' from source: magic vars 11762 1726853298.60628: variable 'omit' from source: magic vars 11762 1726853298.60661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853298.60690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853298.60703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853298.60719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853298.60729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853298.60755: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853298.60758: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.60761: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853298.60830: Set connection var ansible_timeout to 10 11762 1726853298.60833: Set connection var ansible_shell_type to sh 11762 1726853298.60836: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853298.60844: Set connection var ansible_shell_executable to /bin/sh 11762 1726853298.60851: Set connection var ansible_pipelining to False 11762 1726853298.60857: Set connection var ansible_connection to ssh 11762 1726853298.60874: variable 'ansible_shell_executable' from source: unknown 11762 1726853298.60877: variable 'ansible_connection' from source: unknown 11762 1726853298.60880: variable 'ansible_module_compression' from source: unknown 11762 1726853298.60882: variable 'ansible_shell_type' from source: unknown 11762 1726853298.60884: variable 'ansible_shell_executable' from source: unknown 11762 1726853298.60886: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853298.60892: variable 'ansible_pipelining' from source: unknown 11762 1726853298.60894: variable 'ansible_timeout' from source: unknown 11762 1726853298.60896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853298.61038: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853298.61050: variable 'omit' from source: magic vars 11762 1726853298.61053: starting attempt loop 11762 1726853298.61063: running the handler 11762 1726853298.61070: _low_level_execute_command(): starting 11762 1726853298.61078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853298.61586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853298.61590: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.61592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853298.61595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853298.61599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.61647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853298.61651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853298.61653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853298.61740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853298.63507: stdout chunk (state=3): >>>/root <<< 11762 1726853298.63610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853298.63638: stderr chunk (state=3): >>><<< 11762 1726853298.63643: stdout chunk (state=3): >>><<< 11762 1726853298.63664: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853298.63678: _low_level_execute_command(): starting 11762 1726853298.63683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068 `" && echo ansible-tmp-1726853298.636635-14156-203554000058068="` echo /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068 `" ) && sleep 0' 11762 1726853298.64120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853298.64123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.64132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853298.64135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853298.64137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.64178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853298.64182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853298.64187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853298.64259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853298.66278: stdout chunk (state=3): >>>ansible-tmp-1726853298.636635-14156-203554000058068=/root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068 <<< 11762 1726853298.66385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853298.66411: stderr chunk (state=3): >>><<< 11762 1726853298.66414: stdout chunk (state=3): >>><<< 11762 1726853298.66428: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853298.636635-14156-203554000058068=/root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853298.66462: variable 'ansible_module_compression' from source: unknown 11762 1726853298.66500: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11762 1726853298.66531: variable 'ansible_facts' from source: unknown 11762 1726853298.66591: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/AnsiballZ_service_facts.py 11762 1726853298.66684: Sending initial data 11762 1726853298.66688: Sent initial data (161 bytes) 11762 1726853298.67115: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853298.67118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853298.67121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.67123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853298.67126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.67174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853298.67178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853298.67256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853298.68911: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11762 1726853298.68914: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853298.68981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853298.69048: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmppofdrc2z /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/AnsiballZ_service_facts.py <<< 11762 1726853298.69051: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/AnsiballZ_service_facts.py" <<< 11762 1726853298.69114: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmppofdrc2z" to remote "/root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/AnsiballZ_service_facts.py" <<< 11762 1726853298.69779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853298.69814: stderr chunk (state=3): >>><<< 11762 1726853298.69818: stdout chunk (state=3): >>><<< 11762 1726853298.69890: done transferring module to remote 11762 1726853298.69899: _low_level_execute_command(): starting 11762 1726853298.69903: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/ /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/AnsiballZ_service_facts.py && sleep 0' 11762 1726853298.70331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853298.70335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.70337: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853298.70345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853298.70347: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.70388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853298.70391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853298.70465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853298.72337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853298.72362: stderr chunk (state=3): >>><<< 11762 1726853298.72365: stdout chunk (state=3): >>><<< 11762 1726853298.72380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853298.72383: _low_level_execute_command(): starting 11762 1726853298.72387: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/AnsiballZ_service_facts.py && sleep 0' 11762 1726853298.72802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853298.72806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853298.72808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.72810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853298.72812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853298.72846: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853298.72857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853298.72938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853300.36637: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 11762 1726853300.36650: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 11762 1726853300.36657: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 11762 1726853300.36660: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11762 1726853300.38235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853300.38239: stdout chunk (state=3): >>><<< 11762 1726853300.38476: stderr chunk (state=3): >>><<< 11762 1726853300.38484: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": 
{"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853300.39060: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853300.39068: _low_level_execute_command(): starting 11762 1726853300.39081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853298.636635-14156-203554000058068/ > /dev/null 2>&1 && sleep 0' 11762 1726853300.39710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853300.39739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853300.39786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853300.39842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853300.39856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853300.39895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853300.39965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853300.41934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853300.41938: stdout chunk (state=3): >>><<< 11762 1726853300.41940: stderr chunk (state=3): >>><<< 11762 1726853300.42077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 11762 1726853300.42081: handler run complete 11762 1726853300.42221: variable 'ansible_facts' from source: unknown 11762 1726853300.42364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853300.42655: variable 'ansible_facts' from source: unknown 11762 1726853300.42736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853300.42851: attempt loop complete, returning result 11762 1726853300.42856: _execute() done 11762 1726853300.42859: dumping result to json 11762 1726853300.42897: done dumping result, returning 11762 1726853300.42904: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-d845-03d0-000000000b15] 11762 1726853300.42909: sending task result for task 02083763-bbaf-d845-03d0-000000000b15 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853300.43501: no more pending results, returning what we have 11762 1726853300.43503: results queue empty 11762 1726853300.43504: checking for any_errors_fatal 11762 1726853300.43508: done checking for any_errors_fatal 11762 1726853300.43509: checking for max_fail_percentage 11762 1726853300.43511: done checking for max_fail_percentage 11762 1726853300.43512: checking to see if all hosts have failed and the running result is not ok 11762 1726853300.43512: done checking to see if all hosts have failed 11762 1726853300.43513: getting the remaining hosts for this loop 11762 1726853300.43514: done getting the remaining hosts for this loop 11762 1726853300.43517: getting the next task for host managed_node2 11762 1726853300.43523: done getting next task for host managed_node2 11762 1726853300.43526: ^ task is: TASK: fedora.linux_system_roles.network : Check 
which packages are installed 11762 1726853300.43532: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853300.43541: getting variables 11762 1726853300.43543: in VariableManager get_vars() 11762 1726853300.43573: Calling all_inventory to load vars for managed_node2 11762 1726853300.43575: Calling groups_inventory to load vars for managed_node2 11762 1726853300.43577: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853300.43584: Calling all_plugins_play to load vars for managed_node2 11762 1726853300.43585: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853300.43589: Calling groups_plugins_play to load vars for managed_node2 11762 1726853300.44217: done sending task result for task 02083763-bbaf-d845-03d0-000000000b15 11762 1726853300.44221: WORKER PROCESS EXITING 11762 1726853300.44681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853300.45748: done with get_vars() 11762 1726853300.45765: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:20 -0400 (0:00:01.862) 0:00:50.888 ****** 11762 1726853300.45837: entering _queue_task() for managed_node2/package_facts 11762 1726853300.46093: worker is 1 (out of 1 available) 11762 1726853300.46108: exiting _queue_task() for managed_node2/package_facts 11762 1726853300.46121: done queuing things up, now waiting for results queue to drain 11762 1726853300.46123: waiting for pending results... 
11762 1726853300.46303: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11762 1726853300.46417: in run() - task 02083763-bbaf-d845-03d0-000000000b16 11762 1726853300.46432: variable 'ansible_search_path' from source: unknown 11762 1726853300.46435: variable 'ansible_search_path' from source: unknown 11762 1726853300.46464: calling self._execute() 11762 1726853300.46536: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853300.46540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853300.46550: variable 'omit' from source: magic vars 11762 1726853300.46825: variable 'ansible_distribution_major_version' from source: facts 11762 1726853300.46834: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853300.46840: variable 'omit' from source: magic vars 11762 1726853300.46911: variable 'omit' from source: magic vars 11762 1726853300.46933: variable 'omit' from source: magic vars 11762 1726853300.46966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853300.46995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853300.47012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853300.47027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853300.47040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853300.47078: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853300.47081: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853300.47083: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853300.47225: Set connection var ansible_timeout to 10 11762 1726853300.47229: Set connection var ansible_shell_type to sh 11762 1726853300.47231: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853300.47233: Set connection var ansible_shell_executable to /bin/sh 11762 1726853300.47235: Set connection var ansible_pipelining to False 11762 1726853300.47237: Set connection var ansible_connection to ssh 11762 1726853300.47239: variable 'ansible_shell_executable' from source: unknown 11762 1726853300.47241: variable 'ansible_connection' from source: unknown 11762 1726853300.47247: variable 'ansible_module_compression' from source: unknown 11762 1726853300.47249: variable 'ansible_shell_type' from source: unknown 11762 1726853300.47251: variable 'ansible_shell_executable' from source: unknown 11762 1726853300.47252: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853300.47254: variable 'ansible_pipelining' from source: unknown 11762 1726853300.47256: variable 'ansible_timeout' from source: unknown 11762 1726853300.47258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853300.47440: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853300.47447: variable 'omit' from source: magic vars 11762 1726853300.47450: starting attempt loop 11762 1726853300.47453: running the handler 11762 1726853300.47455: _low_level_execute_command(): starting 11762 1726853300.47508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853300.48177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853300.48231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853300.48301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853300.50003: stdout chunk (state=3): >>>/root <<< 11762 1726853300.50137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853300.50140: stderr chunk (state=3): >>><<< 11762 1726853300.50147: stdout chunk (state=3): >>><<< 11762 1726853300.50169: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853300.50184: _low_level_execute_command(): starting 11762 1726853300.50193: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347 `" && echo ansible-tmp-1726853300.5016725-14212-77416107904347="` echo /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347 `" ) && sleep 0' 11762 1726853300.51234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853300.51345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853300.51462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853300.51561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853300.53521: stdout chunk (state=3): >>>ansible-tmp-1726853300.5016725-14212-77416107904347=/root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347 <<< 11762 1726853300.53684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853300.53693: stdout chunk (state=3): >>><<< 11762 1726853300.53764: stderr chunk (state=3): >>><<< 11762 1726853300.53768: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853300.5016725-14212-77416107904347=/root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853300.53804: variable 'ansible_module_compression' from source: unknown 11762 1726853300.53863: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11762 1726853300.53947: variable 'ansible_facts' from source: unknown 11762 1726853300.54164: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/AnsiballZ_package_facts.py 11762 1726853300.54407: Sending initial data 11762 1726853300.54410: Sent initial data (161 bytes) 11762 1726853300.54856: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853300.54880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853300.54937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853300.54940: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 11762 1726853300.54994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853300.55066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853300.56683: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11762 1726853300.56688: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853300.56738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853300.56819: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp7qr_4bf4 /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/AnsiballZ_package_facts.py <<< 11762 1726853300.56823: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/AnsiballZ_package_facts.py" <<< 11762 1726853300.56895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp7qr_4bf4" to remote "/root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/AnsiballZ_package_facts.py" <<< 11762 1726853300.58034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853300.58066: stderr chunk (state=3): >>><<< 11762 1726853300.58069: stdout chunk (state=3): >>><<< 11762 1726853300.58087: done transferring module to remote 11762 1726853300.58097: _low_level_execute_command(): starting 11762 1726853300.58102: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/ /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/AnsiballZ_package_facts.py && sleep 0' 11762 1726853300.58513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853300.58516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853300.58518: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853300.58521: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853300.58523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853300.58576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853300.58581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853300.58650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853300.60484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853300.60507: stderr chunk (state=3): >>><<< 11762 1726853300.60511: stdout chunk (state=3): >>><<< 11762 1726853300.60522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853300.60525: _low_level_execute_command(): starting 11762 1726853300.60529: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/AnsiballZ_package_facts.py && sleep 0' 11762 1726853300.60946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853300.60950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853300.60952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853300.60955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853300.60957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853300.61002: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853300.61005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853300.61087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853301.05773: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11762 1726853301.05798: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": 
"0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11762 1726853301.05846: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11762 1726853301.05853: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": 
"xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11762 1726853301.05878: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common",
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": 
"python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3",
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch":
"noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": 
"perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", 
"release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make":
[{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": 
"23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch":
null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11762 1726853301.08488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed.
<<< 11762 1726853301.08517: stderr chunk (state=3): >>><<< 11762 1726853301.08520: stdout chunk (state=3): >>><<< 11762 1726853301.08558: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853301.15379: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853301.15384: _low_level_execute_command(): starting 11762 1726853301.15386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853300.5016725-14212-77416107904347/ > /dev/null 2>&1 && sleep 0' 11762 1726853301.15934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853301.15951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853301.15964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853301.15984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853301.16001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853301.16013: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853301.16027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853301.16048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853301.16133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853301.16158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853301.16386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853301.18300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853301.18303: stdout chunk (state=3): >>><<< 11762 1726853301.18306: stderr chunk (state=3): >>><<< 11762 1726853301.18318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853301.18329: handler run complete 11762 1726853301.20176: variable 'ansible_facts' from source: unknown 11762 1726853301.20914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.23138: variable 'ansible_facts' from source: unknown 11762 1726853301.23821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.24544: attempt loop complete, returning result 11762 1726853301.24562: _execute() done 11762 1726853301.24569: dumping result to json 11762 1726853301.24788: done dumping result, returning 11762 1726853301.24802: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-d845-03d0-000000000b16] 11762 1726853301.24811: sending task result for task 02083763-bbaf-d845-03d0-000000000b16 11762 1726853301.33663: done sending task result for task 02083763-bbaf-d845-03d0-000000000b16 11762 1726853301.33666: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853301.33772: no more pending results, returning what we have 11762 1726853301.33775: results queue empty 11762 1726853301.33775: checking for any_errors_fatal 11762 1726853301.33778: done checking for any_errors_fatal 11762 1726853301.33779: checking for max_fail_percentage 11762 1726853301.33780: done checking for max_fail_percentage 11762 1726853301.33781: checking to see if all hosts have failed and the running result is not ok 11762 1726853301.33782: done checking to see if all hosts have failed 11762 1726853301.33782: getting the remaining hosts for this loop 11762 
1726853301.33784: done getting the remaining hosts for this loop 11762 1726853301.33786: getting the next task for host managed_node2 11762 1726853301.33791: done getting next task for host managed_node2 11762 1726853301.33794: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853301.33799: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853301.33808: getting variables 11762 1726853301.33809: in VariableManager get_vars() 11762 1726853301.33830: Calling all_inventory to load vars for managed_node2 11762 1726853301.33833: Calling groups_inventory to load vars for managed_node2 11762 1726853301.33835: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853301.33840: Calling all_plugins_play to load vars for managed_node2 11762 1726853301.33842: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853301.33845: Calling groups_plugins_play to load vars for managed_node2 11762 1726853301.35788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.39092: done with get_vars() 11762 1726853301.39230: done getting variables 11762 1726853301.39345: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:21 -0400 (0:00:00.935) 0:00:51.824 ****** 11762 1726853301.39379: entering _queue_task() for managed_node2/debug 11762 1726853301.40054: worker is 1 (out of 1 available) 11762 1726853301.40068: exiting _queue_task() for managed_node2/debug 11762 1726853301.40184: done queuing things up, now waiting for results queue to drain 11762 1726853301.40187: waiting for pending results... 
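The task queued here is `fedora.linux_system_roles.network : Print network provider` at `roles/network/tasks/main.yml:7`. The log does not contain the role source, but from the evaluated conditional (`ansible_distribution_major_version != '6'`), the `network_provider` variable resolved from `set_fact`, and the eventual `MSG: Using network provider: nm`, a hedged reconstruction of such a task would be:

```yaml
# Hedged sketch assembled from evidence in this log; the actual role file
# may differ in wording and structure.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'  # seen evaluated as True below
```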
11762 1726853301.40566: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853301.41178: in run() - task 02083763-bbaf-d845-03d0-000000000a2f 11762 1726853301.41182: variable 'ansible_search_path' from source: unknown 11762 1726853301.41185: variable 'ansible_search_path' from source: unknown 11762 1726853301.41188: calling self._execute() 11762 1726853301.41190: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853301.41576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.41580: variable 'omit' from source: magic vars 11762 1726853301.41868: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.42376: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853301.42379: variable 'omit' from source: magic vars 11762 1726853301.42382: variable 'omit' from source: magic vars 11762 1726853301.42384: variable 'network_provider' from source: set_fact 11762 1726853301.42387: variable 'omit' from source: magic vars 11762 1726853301.42483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853301.42527: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853301.42557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853301.42587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853301.42605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853301.42645: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853301.42656: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 
1726853301.42664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.42769: Set connection var ansible_timeout to 10 11762 1726853301.42783: Set connection var ansible_shell_type to sh 11762 1726853301.42799: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853301.42812: Set connection var ansible_shell_executable to /bin/sh 11762 1726853301.42827: Set connection var ansible_pipelining to False 11762 1726853301.42838: Set connection var ansible_connection to ssh 11762 1726853301.42866: variable 'ansible_shell_executable' from source: unknown 11762 1726853301.42876: variable 'ansible_connection' from source: unknown 11762 1726853301.42883: variable 'ansible_module_compression' from source: unknown 11762 1726853301.42890: variable 'ansible_shell_type' from source: unknown 11762 1726853301.42901: variable 'ansible_shell_executable' from source: unknown 11762 1726853301.42908: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853301.42916: variable 'ansible_pipelining' from source: unknown 11762 1726853301.42922: variable 'ansible_timeout' from source: unknown 11762 1726853301.42930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.43086: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853301.43105: variable 'omit' from source: magic vars 11762 1726853301.43120: starting attempt loop 11762 1726853301.43129: running the handler 11762 1726853301.43187: handler run complete 11762 1726853301.43207: attempt loop complete, returning result 11762 1726853301.43214: _execute() done 11762 1726853301.43226: dumping result to json 11762 1726853301.43233: done dumping result, returning 
11762 1726853301.43248: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-d845-03d0-000000000a2f] 11762 1726853301.43259: sending task result for task 02083763-bbaf-d845-03d0-000000000a2f 11762 1726853301.43374: done sending task result for task 02083763-bbaf-d845-03d0-000000000a2f 11762 1726853301.43383: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 11762 1726853301.43447: no more pending results, returning what we have 11762 1726853301.43452: results queue empty 11762 1726853301.43453: checking for any_errors_fatal 11762 1726853301.43465: done checking for any_errors_fatal 11762 1726853301.43465: checking for max_fail_percentage 11762 1726853301.43467: done checking for max_fail_percentage 11762 1726853301.43468: checking to see if all hosts have failed and the running result is not ok 11762 1726853301.43469: done checking to see if all hosts have failed 11762 1726853301.43469: getting the remaining hosts for this loop 11762 1726853301.43472: done getting the remaining hosts for this loop 11762 1726853301.43475: getting the next task for host managed_node2 11762 1726853301.43483: done getting next task for host managed_node2 11762 1726853301.43487: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853301.43492: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853301.43503: getting variables 11762 1726853301.43504: in VariableManager get_vars() 11762 1726853301.43545: Calling all_inventory to load vars for managed_node2 11762 1726853301.43548: Calling groups_inventory to load vars for managed_node2 11762 1726853301.43550: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853301.43559: Calling all_plugins_play to load vars for managed_node2 11762 1726853301.43562: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853301.43564: Calling groups_plugins_play to load vars for managed_node2 11762 1726853301.45230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.48454: done with get_vars() 11762 1726853301.48683: done getting variables 11762 1726853301.48747: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:21 -0400 (0:00:00.094) 0:00:51.918 ****** 11762 1726853301.48794: entering _queue_task() for managed_node2/fail 11762 1726853301.49530: worker is 1 (out of 1 available) 11762 1726853301.49548: exiting _queue_task() for managed_node2/fail 11762 1726853301.49561: done queuing things up, now waiting for results queue to drain 11762 1726853301.49563: waiting for pending results... 11762 1726853301.50117: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853301.50351: in run() - task 02083763-bbaf-d845-03d0-000000000a30 11762 1726853301.50366: variable 'ansible_search_path' from source: unknown 11762 1726853301.50373: variable 'ansible_search_path' from source: unknown 11762 1726853301.50407: calling self._execute() 11762 1726853301.50752: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853301.50756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.50759: variable 'omit' from source: magic vars 11762 1726853301.51486: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.51491: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853301.51735: variable 'network_state' from source: role '' defaults 11762 1726853301.51748: Evaluated conditional (network_state != {}): False 11762 1726853301.51752: when evaluation is False, skipping this task 11762 1726853301.51755: _execute() done 11762 1726853301.51758: dumping result to json 11762 1726853301.51760: done dumping result, returning 11762 1726853301.51764: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-d845-03d0-000000000a30] 11762 1726853301.51769: sending task result for task 02083763-bbaf-d845-03d0-000000000a30 11762 1726853301.52378: done sending task result for task 02083763-bbaf-d845-03d0-000000000a30 11762 1726853301.52383: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853301.52424: no more pending results, returning what we have 11762 1726853301.52427: results queue empty 11762 1726853301.52428: checking for any_errors_fatal 11762 1726853301.52432: done checking for any_errors_fatal 11762 1726853301.52433: checking for max_fail_percentage 11762 1726853301.52434: done checking for max_fail_percentage 11762 1726853301.52435: checking to see if all hosts have failed and the running result is not ok 11762 1726853301.52436: done checking to see if all hosts have failed 11762 1726853301.52437: getting the remaining hosts for this loop 11762 1726853301.52438: done getting the remaining hosts for this loop 11762 1726853301.52441: getting the next task for host managed_node2 11762 1726853301.52449: done getting next task for host managed_node2 11762 1726853301.52452: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853301.52457: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853301.52478: getting variables 11762 1726853301.52481: in VariableManager get_vars() 11762 1726853301.52518: Calling all_inventory to load vars for managed_node2 11762 1726853301.52520: Calling groups_inventory to load vars for managed_node2 11762 1726853301.52522: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853301.52530: Calling all_plugins_play to load vars for managed_node2 11762 1726853301.52533: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853301.52536: Calling groups_plugins_play to load vars for managed_node2 11762 1726853301.56739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.60578: done with get_vars() 11762 1726853301.60612: done getting variables 11762 1726853301.60889: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:21 -0400 (0:00:00.121) 0:00:52.039 ****** 11762 1726853301.60927: entering _queue_task() for managed_node2/fail 11762 1726853301.61946: worker is 1 (out of 1 available) 11762 1726853301.61958: exiting _queue_task() for managed_node2/fail 11762 1726853301.61969: done queuing things up, now waiting for results queue to drain 11762 1726853301.61973: waiting for pending results... 11762 1726853301.62199: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853301.62359: in run() - task 02083763-bbaf-d845-03d0-000000000a31 11762 1726853301.62375: variable 'ansible_search_path' from source: unknown 11762 1726853301.62379: variable 'ansible_search_path' from source: unknown 11762 1726853301.62429: calling self._execute() 11762 1726853301.62663: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853301.62668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.62673: variable 'omit' from source: magic vars 11762 1726853301.63214: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.63218: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853301.63244: variable 'network_state' from source: role '' defaults 11762 1726853301.63258: Evaluated conditional (network_state != {}): False 11762 1726853301.63262: when evaluation is False, skipping this task 11762 1726853301.63264: _execute() done 11762 1726853301.63267: dumping result to json 11762 1726853301.63269: done dumping result, returning 11762 1726853301.63279: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-d845-03d0-000000000a31] 11762 1726853301.63290: sending task result for task 02083763-bbaf-d845-03d0-000000000a31 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853301.63447: no more pending results, returning what we have 11762 1726853301.63452: results queue empty 11762 1726853301.63453: checking for any_errors_fatal 11762 1726853301.63462: done checking for any_errors_fatal 11762 1726853301.63463: checking for max_fail_percentage 11762 1726853301.63466: done checking for max_fail_percentage 11762 1726853301.63467: checking to see if all hosts have failed and the running result is not ok 11762 1726853301.63468: done checking to see if all hosts have failed 11762 1726853301.63469: getting the remaining hosts for this loop 11762 1726853301.63473: done getting the remaining hosts for this loop 11762 1726853301.63478: getting the next task for host managed_node2 11762 1726853301.63486: done getting next task for host managed_node2 11762 1726853301.63490: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853301.63497: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853301.63523: getting variables 11762 1726853301.63525: in VariableManager get_vars() 11762 1726853301.63594: Calling all_inventory to load vars for managed_node2 11762 1726853301.63597: Calling groups_inventory to load vars for managed_node2 11762 1726853301.63601: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853301.63614: Calling all_plugins_play to load vars for managed_node2 11762 1726853301.63618: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853301.63621: Calling groups_plugins_play to load vars for managed_node2 11762 1726853301.64277: done sending task result for task 02083763-bbaf-d845-03d0-000000000a31 11762 1726853301.64282: WORKER PROCESS EXITING 11762 1726853301.66434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.68555: done with get_vars() 11762 1726853301.68590: done getting variables 11762 1726853301.68652: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:21 -0400 (0:00:00.077) 0:00:52.117 ****** 11762 1726853301.68696: entering _queue_task() for managed_node2/fail 11762 1726853301.69148: worker is 1 (out of 1 available) 11762 1726853301.69161: exiting _queue_task() for managed_node2/fail 11762 1726853301.69176: done queuing things up, now waiting for results queue to drain 11762 1726853301.69178: waiting for pending results... 11762 1726853301.69387: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853301.69567: in run() - task 02083763-bbaf-d845-03d0-000000000a32 11762 1726853301.69589: variable 'ansible_search_path' from source: unknown 11762 1726853301.69597: variable 'ansible_search_path' from source: unknown 11762 1726853301.69642: calling self._execute() 11762 1726853301.69748: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853301.69760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.69779: variable 'omit' from source: magic vars 11762 1726853301.70181: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.70203: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853301.70390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853301.73010: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853301.73089: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853301.73139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 
1726853301.73242: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853301.73245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853301.73294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.73328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853301.73366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.73414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.73437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.73542: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.73574: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11762 1726853301.73690: variable 'ansible_distribution' from source: facts 11762 1726853301.73785: variable '__network_rh_distros' from source: role '' defaults 11762 1726853301.73788: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11762 1726853301.73953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.74003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853301.74032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.74480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.74484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.74486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.74489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853301.74491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.74493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.74503: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.74549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.74582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853301.74725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.74768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.74817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.75598: variable 'network_connections' from source: task vars 11762 1726853301.75616: variable 'controller_profile' from source: play vars 11762 1726853301.75897: variable 'controller_profile' from source: play vars 11762 1726853301.75901: variable 'controller_device' from source: play vars 11762 1726853301.75904: variable 'controller_device' from source: play vars 11762 1726853301.75907: variable 'dhcp_interface1' from source: play vars 11762 1726853301.76057: variable 'dhcp_interface1' from source: play vars 11762 1726853301.76094: variable 'port1_profile' from source: play vars 11762 1726853301.76176: variable 'port1_profile' from source: play vars 11762 
1726853301.76454: variable 'dhcp_interface1' from source: play vars 11762 1726853301.76457: variable 'dhcp_interface1' from source: play vars 11762 1726853301.76459: variable 'controller_profile' from source: play vars 11762 1726853301.76583: variable 'controller_profile' from source: play vars 11762 1726853301.76597: variable 'port2_profile' from source: play vars 11762 1726853301.76786: variable 'port2_profile' from source: play vars 11762 1726853301.76810: variable 'dhcp_interface2' from source: play vars 11762 1726853301.76874: variable 'dhcp_interface2' from source: play vars 11762 1726853301.76933: variable 'controller_profile' from source: play vars 11762 1726853301.77065: variable 'controller_profile' from source: play vars 11762 1726853301.77119: variable 'network_state' from source: role '' defaults 11762 1726853301.77326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853301.77703: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853301.77706: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853301.77709: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853301.77721: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853301.77772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853301.77828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853301.77867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.77899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853301.77940: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11762 1726853301.77944: when evaluation is False, skipping this task 11762 1726853301.77946: _execute() done 11762 1726853301.77951: dumping result to json 11762 1726853301.77954: done dumping result, returning 11762 1726853301.77964: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-d845-03d0-000000000a32] 11762 1726853301.77975: sending task result for task 02083763-bbaf-d845-03d0-000000000a32 11762 1726853301.78063: done sending task result for task 02083763-bbaf-d845-03d0-000000000a32 11762 1726853301.78066: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11762 1726853301.78133: no more pending results, returning what we have 11762 1726853301.78137: results queue empty 11762 1726853301.78138: checking for any_errors_fatal 11762 1726853301.78143: done checking for any_errors_fatal 11762 1726853301.78144: checking for max_fail_percentage 11762 
1726853301.78146: done checking for max_fail_percentage 11762 1726853301.78147: checking to see if all hosts have failed and the running result is not ok 11762 1726853301.78148: done checking to see if all hosts have failed 11762 1726853301.78148: getting the remaining hosts for this loop 11762 1726853301.78150: done getting the remaining hosts for this loop 11762 1726853301.78153: getting the next task for host managed_node2 11762 1726853301.78160: done getting next task for host managed_node2 11762 1726853301.78163: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853301.78168: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853301.78189: getting variables 11762 1726853301.78191: in VariableManager get_vars() 11762 1726853301.78234: Calling all_inventory to load vars for managed_node2 11762 1726853301.78236: Calling groups_inventory to load vars for managed_node2 11762 1726853301.78238: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853301.78247: Calling all_plugins_play to load vars for managed_node2 11762 1726853301.78250: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853301.78253: Calling groups_plugins_play to load vars for managed_node2 11762 1726853301.80801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.83878: done with get_vars() 11762 1726853301.83926: done getting variables 11762 1726853301.83997: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:21 -0400 (0:00:00.153) 0:00:52.270 ****** 11762 1726853301.84035: entering _queue_task() for managed_node2/dnf 11762 1726853301.84521: worker is 1 (out of 1 available) 11762 1726853301.84535: exiting _queue_task() for managed_node2/dnf 11762 1726853301.84547: done queuing things up, now waiting for results queue to drain 11762 1726853301.84549: waiting for pending results... 
11762 1726853301.84849: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853301.84948: in run() - task 02083763-bbaf-d845-03d0-000000000a33 11762 1726853301.84972: variable 'ansible_search_path' from source: unknown 11762 1726853301.84983: variable 'ansible_search_path' from source: unknown 11762 1726853301.85031: calling self._execute() 11762 1726853301.85136: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853301.85149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.85272: variable 'omit' from source: magic vars 11762 1726853301.85673: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.85706: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853301.86149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853301.89105: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853301.89180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853301.89223: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853301.89264: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853301.89294: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853301.89376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.89489: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853301.89525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.89577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.89601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.89736: variable 'ansible_distribution' from source: facts 11762 1726853301.89795: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.89799: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11762 1726853301.89906: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853301.90049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.90085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853301.90124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.90230: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.90234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.90237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.90268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853301.90301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.90353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.90376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.90420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853301.90676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 
1726853301.90680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.90683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853301.90685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853301.90698: variable 'network_connections' from source: task vars 11762 1726853301.90717: variable 'controller_profile' from source: play vars 11762 1726853301.90784: variable 'controller_profile' from source: play vars 11762 1726853301.90808: variable 'controller_device' from source: play vars 11762 1726853301.90872: variable 'controller_device' from source: play vars 11762 1726853301.90885: variable 'dhcp_interface1' from source: play vars 11762 1726853301.90943: variable 'dhcp_interface1' from source: play vars 11762 1726853301.90957: variable 'port1_profile' from source: play vars 11762 1726853301.91017: variable 'port1_profile' from source: play vars 11762 1726853301.91033: variable 'dhcp_interface1' from source: play vars 11762 1726853301.91088: variable 'dhcp_interface1' from source: play vars 11762 1726853301.91099: variable 'controller_profile' from source: play vars 11762 1726853301.91168: variable 'controller_profile' from source: play vars 11762 1726853301.91182: variable 'port2_profile' from source: play vars 11762 1726853301.91249: variable 'port2_profile' from source: play vars 11762 1726853301.91261: variable 'dhcp_interface2' from source: play vars 11762 1726853301.91324: variable 'dhcp_interface2' from source: play vars 11762 1726853301.91354: variable 'controller_profile' from 
source: play vars 11762 1726853301.91455: variable 'controller_profile' from source: play vars 11762 1726853301.91500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853301.91688: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853301.91729: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853301.91764: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853301.91805: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853301.91888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853301.91900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853301.91928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853301.91957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853301.92077: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853301.92290: variable 'network_connections' from source: task vars 11762 1726853301.92300: variable 'controller_profile' from source: play vars 11762 1726853301.92369: variable 'controller_profile' from source: play vars 11762 1726853301.92383: variable 'controller_device' from source: play vars 
11762 1726853301.92449: variable 'controller_device' from source: play vars 11762 1726853301.92462: variable 'dhcp_interface1' from source: play vars 11762 1726853301.92523: variable 'dhcp_interface1' from source: play vars 11762 1726853301.92650: variable 'port1_profile' from source: play vars 11762 1726853301.92653: variable 'port1_profile' from source: play vars 11762 1726853301.92656: variable 'dhcp_interface1' from source: play vars 11762 1726853301.92677: variable 'dhcp_interface1' from source: play vars 11762 1726853301.92688: variable 'controller_profile' from source: play vars 11762 1726853301.92748: variable 'controller_profile' from source: play vars 11762 1726853301.92766: variable 'port2_profile' from source: play vars 11762 1726853301.92828: variable 'port2_profile' from source: play vars 11762 1726853301.92839: variable 'dhcp_interface2' from source: play vars 11762 1726853301.92908: variable 'dhcp_interface2' from source: play vars 11762 1726853301.92918: variable 'controller_profile' from source: play vars 11762 1726853301.92986: variable 'controller_profile' from source: play vars 11762 1726853301.93022: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853301.93030: when evaluation is False, skipping this task 11762 1726853301.93037: _execute() done 11762 1726853301.93043: dumping result to json 11762 1726853301.93049: done dumping result, returning 11762 1726853301.93060: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000a33] 11762 1726853301.93069: sending task result for task 02083763-bbaf-d845-03d0-000000000a33 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional 
result was False" } 11762 1726853301.93238: no more pending results, returning what we have 11762 1726853301.93243: results queue empty 11762 1726853301.93244: checking for any_errors_fatal 11762 1726853301.93249: done checking for any_errors_fatal 11762 1726853301.93250: checking for max_fail_percentage 11762 1726853301.93252: done checking for max_fail_percentage 11762 1726853301.93253: checking to see if all hosts have failed and the running result is not ok 11762 1726853301.93254: done checking to see if all hosts have failed 11762 1726853301.93254: getting the remaining hosts for this loop 11762 1726853301.93256: done getting the remaining hosts for this loop 11762 1726853301.93259: getting the next task for host managed_node2 11762 1726853301.93267: done getting next task for host managed_node2 11762 1726853301.93272: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853301.93278: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853301.93299: getting variables 11762 1726853301.93300: in VariableManager get_vars() 11762 1726853301.93346: Calling all_inventory to load vars for managed_node2 11762 1726853301.93349: Calling groups_inventory to load vars for managed_node2 11762 1726853301.93351: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853301.93361: Calling all_plugins_play to load vars for managed_node2 11762 1726853301.93364: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853301.93367: Calling groups_plugins_play to load vars for managed_node2 11762 1726853301.94205: done sending task result for task 02083763-bbaf-d845-03d0-000000000a33 11762 1726853301.94209: WORKER PROCESS EXITING 11762 1726853301.95081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853301.96865: done with get_vars() 11762 1726853301.96888: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11762 1726853301.96966: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:28:21 -0400 (0:00:00.129) 0:00:52.400 ****** 11762 1726853301.97000: entering _queue_task() for managed_node2/yum 
11762 1726853301.97498: worker is 1 (out of 1 available) 11762 1726853301.97508: exiting _queue_task() for managed_node2/yum 11762 1726853301.97518: done queuing things up, now waiting for results queue to drain 11762 1726853301.97520: waiting for pending results... 11762 1726853301.97633: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853301.97797: in run() - task 02083763-bbaf-d845-03d0-000000000a34 11762 1726853301.97818: variable 'ansible_search_path' from source: unknown 11762 1726853301.97828: variable 'ansible_search_path' from source: unknown 11762 1726853301.97876: calling self._execute() 11762 1726853301.97973: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853301.98080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853301.98083: variable 'omit' from source: magic vars 11762 1726853301.98378: variable 'ansible_distribution_major_version' from source: facts 11762 1726853301.98402: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853301.98593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853302.00907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853302.00975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853302.01023: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853302.01062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853302.01098: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853302.01189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.01241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.01274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.01318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.01343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.01560: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.01563: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11762 1726853302.01566: when evaluation is False, skipping this task 11762 1726853302.01568: _execute() done 11762 1726853302.01572: dumping result to json 11762 1726853302.01575: done dumping result, returning 11762 1726853302.01578: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000a34] 11762 1726853302.01581: sending task result for task 02083763-bbaf-d845-03d0-000000000a34 11762 1726853302.01875: done sending 
task result for task 02083763-bbaf-d845-03d0-000000000a34 11762 1726853302.01881: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11762 1726853302.01933: no more pending results, returning what we have 11762 1726853302.01937: results queue empty 11762 1726853302.01938: checking for any_errors_fatal 11762 1726853302.01943: done checking for any_errors_fatal 11762 1726853302.01944: checking for max_fail_percentage 11762 1726853302.01946: done checking for max_fail_percentage 11762 1726853302.01947: checking to see if all hosts have failed and the running result is not ok 11762 1726853302.01948: done checking to see if all hosts have failed 11762 1726853302.01949: getting the remaining hosts for this loop 11762 1726853302.01951: done getting the remaining hosts for this loop 11762 1726853302.01954: getting the next task for host managed_node2 11762 1726853302.01961: done getting next task for host managed_node2 11762 1726853302.01966: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853302.01973: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853302.01995: getting variables 11762 1726853302.01997: in VariableManager get_vars() 11762 1726853302.02043: Calling all_inventory to load vars for managed_node2 11762 1726853302.02046: Calling groups_inventory to load vars for managed_node2 11762 1726853302.02048: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853302.02058: Calling all_plugins_play to load vars for managed_node2 11762 1726853302.02061: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853302.02064: Calling groups_plugins_play to load vars for managed_node2 11762 1726853302.03519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853302.05143: done with get_vars() 11762 1726853302.05178: done getting variables 11762 1726853302.05251: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:28:22 -0400 (0:00:00.082) 0:00:52.483 ****** 11762 1726853302.05294: entering _queue_task() for managed_node2/fail 11762 1726853302.05805: worker is 1 (out of 1 available) 11762 1726853302.05818: exiting 
_queue_task() for managed_node2/fail 11762 1726853302.05830: done queuing things up, now waiting for results queue to drain 11762 1726853302.05832: waiting for pending results... 11762 1726853302.06033: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853302.06211: in run() - task 02083763-bbaf-d845-03d0-000000000a35 11762 1726853302.06233: variable 'ansible_search_path' from source: unknown 11762 1726853302.06240: variable 'ansible_search_path' from source: unknown 11762 1726853302.06291: calling self._execute() 11762 1726853302.06391: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.06402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.06415: variable 'omit' from source: magic vars 11762 1726853302.06811: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.06828: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853302.06953: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853302.07162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853302.09329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853302.09424: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853302.09447: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853302.09491: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853302.09573: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 
1726853302.09613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.09665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.09702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.09776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.09779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.09826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.09858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.09890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.09965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 11762 1726853302.09969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.10003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.10036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.10066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.10123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.10178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.10342: variable 'network_connections' from source: task vars 11762 1726853302.10362: variable 'controller_profile' from source: play vars 11762 1726853302.10436: variable 'controller_profile' from source: play vars 11762 1726853302.10459: variable 'controller_device' from source: play vars 11762 1726853302.10531: variable 'controller_device' from source: play vars 11762 1726853302.10546: variable 'dhcp_interface1' from source: play vars 11762 1726853302.10666: variable 'dhcp_interface1' from source: play vars 11762 1726853302.10669: variable 'port1_profile' from source: play vars 11762 1726853302.10696: 
variable 'port1_profile' from source: play vars 11762 1726853302.10707: variable 'dhcp_interface1' from source: play vars 11762 1726853302.10773: variable 'dhcp_interface1' from source: play vars 11762 1726853302.10877: variable 'controller_profile' from source: play vars 11762 1726853302.10880: variable 'controller_profile' from source: play vars 11762 1726853302.10884: variable 'port2_profile' from source: play vars 11762 1726853302.10922: variable 'port2_profile' from source: play vars 11762 1726853302.10935: variable 'dhcp_interface2' from source: play vars 11762 1726853302.11005: variable 'dhcp_interface2' from source: play vars 11762 1726853302.11018: variable 'controller_profile' from source: play vars 11762 1726853302.11082: variable 'controller_profile' from source: play vars 11762 1726853302.11162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853302.11329: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853302.11359: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853302.11383: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853302.11406: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853302.11444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853302.11467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853302.11487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.11508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853302.11562: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853302.11980: variable 'network_connections' from source: task vars 11762 1726853302.11984: variable 'controller_profile' from source: play vars 11762 1726853302.12026: variable 'controller_profile' from source: play vars 11762 1726853302.12033: variable 'controller_device' from source: play vars 11762 1726853302.12075: variable 'controller_device' from source: play vars 11762 1726853302.12083: variable 'dhcp_interface1' from source: play vars 11762 1726853302.12126: variable 'dhcp_interface1' from source: play vars 11762 1726853302.12132: variable 'port1_profile' from source: play vars 11762 1726853302.12175: variable 'port1_profile' from source: play vars 11762 1726853302.12180: variable 'dhcp_interface1' from source: play vars 11762 1726853302.12225: variable 'dhcp_interface1' from source: play vars 11762 1726853302.12230: variable 'controller_profile' from source: play vars 11762 1726853302.12274: variable 'controller_profile' from source: play vars 11762 1726853302.12280: variable 'port2_profile' from source: play vars 11762 1726853302.12323: variable 'port2_profile' from source: play vars 11762 1726853302.12329: variable 'dhcp_interface2' from source: play vars 11762 1726853302.12373: variable 'dhcp_interface2' from source: play vars 11762 1726853302.12378: variable 'controller_profile' from source: play vars 11762 1726853302.12424: variable 'controller_profile' from source: play vars 11762 1726853302.12447: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 
1726853302.12450: when evaluation is False, skipping this task 11762 1726853302.12453: _execute() done 11762 1726853302.12455: dumping result to json 11762 1726853302.12457: done dumping result, returning 11762 1726853302.12463: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000a35] 11762 1726853302.12468: sending task result for task 02083763-bbaf-d845-03d0-000000000a35 11762 1726853302.12560: done sending task result for task 02083763-bbaf-d845-03d0-000000000a35 11762 1726853302.12562: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853302.12613: no more pending results, returning what we have 11762 1726853302.12617: results queue empty 11762 1726853302.12618: checking for any_errors_fatal 11762 1726853302.12624: done checking for any_errors_fatal 11762 1726853302.12624: checking for max_fail_percentage 11762 1726853302.12626: done checking for max_fail_percentage 11762 1726853302.12627: checking to see if all hosts have failed and the running result is not ok 11762 1726853302.12628: done checking to see if all hosts have failed 11762 1726853302.12628: getting the remaining hosts for this loop 11762 1726853302.12631: done getting the remaining hosts for this loop 11762 1726853302.12634: getting the next task for host managed_node2 11762 1726853302.12641: done getting next task for host managed_node2 11762 1726853302.12647: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11762 1726853302.12652: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853302.12672: getting variables 11762 1726853302.12674: in VariableManager get_vars() 11762 1726853302.12716: Calling all_inventory to load vars for managed_node2 11762 1726853302.12719: Calling groups_inventory to load vars for managed_node2 11762 1726853302.12721: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853302.12730: Calling all_plugins_play to load vars for managed_node2 11762 1726853302.12732: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853302.12735: Calling groups_plugins_play to load vars for managed_node2 11762 1726853302.14154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853302.15036: done with get_vars() 11762 1726853302.15057: done getting variables 11762 1726853302.15103: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:28:22 -0400 (0:00:00.098) 0:00:52.581 ****** 11762 1726853302.15131: entering _queue_task() for managed_node2/package 11762 1726853302.15395: worker is 1 (out of 1 available) 11762 1726853302.15409: exiting _queue_task() for managed_node2/package 11762 1726853302.15424: done queuing things up, now waiting for results queue to drain 11762 1726853302.15426: waiting for pending results... 11762 1726853302.15613: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11762 1726853302.15727: in run() - task 02083763-bbaf-d845-03d0-000000000a36 11762 1726853302.15739: variable 'ansible_search_path' from source: unknown 11762 1726853302.15746: variable 'ansible_search_path' from source: unknown 11762 1726853302.15777: calling self._execute() 11762 1726853302.15848: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.15852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.15861: variable 'omit' from source: magic vars 11762 1726853302.16141: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.16152: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853302.16477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853302.16587: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853302.16637: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853302.16672: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853302.16748: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853302.16861: variable 'network_packages' from source: role '' defaults 11762 1726853302.16965: variable '__network_provider_setup' from source: role '' defaults 11762 1726853302.16988: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853302.17065: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853302.17087: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853302.17164: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853302.17362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853302.18830: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853302.18874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853302.18904: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853302.18929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853302.18950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853302.19011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.19032: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.19052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.19080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.19094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.19125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.19140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.19159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.19185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.19199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 
1726853302.19342: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853302.19418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.19434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.19454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.19480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.19490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.19553: variable 'ansible_python' from source: facts 11762 1726853302.19567: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853302.19622: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853302.19682: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853302.19818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.19824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.19852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.19901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.19917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.19964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.20076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.20079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.20081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.20084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.20214: variable 'network_connections' from source: task vars 
11762 1726853302.20228: variable 'controller_profile' from source: play vars 11762 1726853302.20335: variable 'controller_profile' from source: play vars 11762 1726853302.20354: variable 'controller_device' from source: play vars 11762 1726853302.20460: variable 'controller_device' from source: play vars 11762 1726853302.20483: variable 'dhcp_interface1' from source: play vars 11762 1726853302.20560: variable 'dhcp_interface1' from source: play vars 11762 1726853302.20568: variable 'port1_profile' from source: play vars 11762 1726853302.20663: variable 'port1_profile' from source: play vars 11762 1726853302.20672: variable 'dhcp_interface1' from source: play vars 11762 1726853302.20875: variable 'dhcp_interface1' from source: play vars 11762 1726853302.20878: variable 'controller_profile' from source: play vars 11762 1726853302.20880: variable 'controller_profile' from source: play vars 11762 1726853302.20882: variable 'port2_profile' from source: play vars 11762 1726853302.20968: variable 'port2_profile' from source: play vars 11762 1726853302.20986: variable 'dhcp_interface2' from source: play vars 11762 1726853302.21092: variable 'dhcp_interface2' from source: play vars 11762 1726853302.21106: variable 'controller_profile' from source: play vars 11762 1726853302.21216: variable 'controller_profile' from source: play vars 11762 1726853302.21293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853302.21323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853302.21359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 
11762 1726853302.21399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853302.21457: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853302.21756: variable 'network_connections' from source: task vars 11762 1726853302.21769: variable 'controller_profile' from source: play vars 11762 1726853302.21864: variable 'controller_profile' from source: play vars 11762 1726853302.21874: variable 'controller_device' from source: play vars 11762 1726853302.21943: variable 'controller_device' from source: play vars 11762 1726853302.21953: variable 'dhcp_interface1' from source: play vars 11762 1726853302.22010: variable 'dhcp_interface1' from source: play vars 11762 1726853302.22022: variable 'port1_profile' from source: play vars 11762 1726853302.22089: variable 'port1_profile' from source: play vars 11762 1726853302.22097: variable 'dhcp_interface1' from source: play vars 11762 1726853302.22168: variable 'dhcp_interface1' from source: play vars 11762 1726853302.22173: variable 'controller_profile' from source: play vars 11762 1726853302.22241: variable 'controller_profile' from source: play vars 11762 1726853302.22251: variable 'port2_profile' from source: play vars 11762 1726853302.22318: variable 'port2_profile' from source: play vars 11762 1726853302.22325: variable 'dhcp_interface2' from source: play vars 11762 1726853302.22397: variable 'dhcp_interface2' from source: play vars 11762 1726853302.22405: variable 'controller_profile' from source: play vars 11762 1726853302.22475: variable 'controller_profile' from source: play vars 11762 1726853302.22514: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853302.22573: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853302.22768: variable 'network_connections' 
from source: task vars 11762 1726853302.22774: variable 'controller_profile' from source: play vars 11762 1726853302.22820: variable 'controller_profile' from source: play vars 11762 1726853302.22825: variable 'controller_device' from source: play vars 11762 1726853302.22872: variable 'controller_device' from source: play vars 11762 1726853302.22879: variable 'dhcp_interface1' from source: play vars 11762 1726853302.22929: variable 'dhcp_interface1' from source: play vars 11762 1726853302.22932: variable 'port1_profile' from source: play vars 11762 1726853302.22980: variable 'port1_profile' from source: play vars 11762 1726853302.22986: variable 'dhcp_interface1' from source: play vars 11762 1726853302.23033: variable 'dhcp_interface1' from source: play vars 11762 1726853302.23036: variable 'controller_profile' from source: play vars 11762 1726853302.23085: variable 'controller_profile' from source: play vars 11762 1726853302.23092: variable 'port2_profile' from source: play vars 11762 1726853302.23137: variable 'port2_profile' from source: play vars 11762 1726853302.23143: variable 'dhcp_interface2' from source: play vars 11762 1726853302.23190: variable 'dhcp_interface2' from source: play vars 11762 1726853302.23196: variable 'controller_profile' from source: play vars 11762 1726853302.23242: variable 'controller_profile' from source: play vars 11762 1726853302.23264: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853302.23319: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853302.23514: variable 'network_connections' from source: task vars 11762 1726853302.23517: variable 'controller_profile' from source: play vars 11762 1726853302.23565: variable 'controller_profile' from source: play vars 11762 1726853302.23572: variable 'controller_device' from source: play vars 11762 1726853302.23616: variable 'controller_device' from source: play vars 11762 1726853302.23623: variable 
'dhcp_interface1' from source: play vars 11762 1726853302.23876: variable 'dhcp_interface1' from source: play vars 11762 1726853302.23880: variable 'port1_profile' from source: play vars 11762 1726853302.23883: variable 'port1_profile' from source: play vars 11762 1726853302.23885: variable 'dhcp_interface1' from source: play vars 11762 1726853302.23887: variable 'dhcp_interface1' from source: play vars 11762 1726853302.23888: variable 'controller_profile' from source: play vars 11762 1726853302.23904: variable 'controller_profile' from source: play vars 11762 1726853302.23915: variable 'port2_profile' from source: play vars 11762 1726853302.23984: variable 'port2_profile' from source: play vars 11762 1726853302.23996: variable 'dhcp_interface2' from source: play vars 11762 1726853302.24063: variable 'dhcp_interface2' from source: play vars 11762 1726853302.24079: variable 'controller_profile' from source: play vars 11762 1726853302.24139: variable 'controller_profile' from source: play vars 11762 1726853302.24206: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853302.24286: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853302.24303: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853302.24367: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853302.24586: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853302.24945: variable 'network_connections' from source: task vars 11762 1726853302.24958: variable 'controller_profile' from source: play vars 11762 1726853302.25000: variable 'controller_profile' from source: play vars 11762 1726853302.25006: variable 'controller_device' from source: play vars 11762 1726853302.25051: variable 'controller_device' from source: play vars 11762 1726853302.25058: variable 'dhcp_interface1' from source: play vars 
11762 1726853302.25099: variable 'dhcp_interface1' from source: play vars 11762 1726853302.25106: variable 'port1_profile' from source: play vars 11762 1726853302.25149: variable 'port1_profile' from source: play vars 11762 1726853302.25155: variable 'dhcp_interface1' from source: play vars 11762 1726853302.25198: variable 'dhcp_interface1' from source: play vars 11762 1726853302.25203: variable 'controller_profile' from source: play vars 11762 1726853302.25243: variable 'controller_profile' from source: play vars 11762 1726853302.25255: variable 'port2_profile' from source: play vars 11762 1726853302.25295: variable 'port2_profile' from source: play vars 11762 1726853302.25301: variable 'dhcp_interface2' from source: play vars 11762 1726853302.25340: variable 'dhcp_interface2' from source: play vars 11762 1726853302.25348: variable 'controller_profile' from source: play vars 11762 1726853302.25392: variable 'controller_profile' from source: play vars 11762 1726853302.25398: variable 'ansible_distribution' from source: facts 11762 1726853302.25401: variable '__network_rh_distros' from source: role '' defaults 11762 1726853302.25407: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.25426: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853302.25532: variable 'ansible_distribution' from source: facts 11762 1726853302.25535: variable '__network_rh_distros' from source: role '' defaults 11762 1726853302.25539: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.25553: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853302.25659: variable 'ansible_distribution' from source: facts 11762 1726853302.25662: variable '__network_rh_distros' from source: role '' defaults 11762 1726853302.25667: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.25697: variable 
'network_provider' from source: set_fact 11762 1726853302.25709: variable 'ansible_facts' from source: unknown 11762 1726853302.26149: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11762 1726853302.26152: when evaluation is False, skipping this task 11762 1726853302.26155: _execute() done 11762 1726853302.26157: dumping result to json 11762 1726853302.26159: done dumping result, returning 11762 1726853302.26166: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-d845-03d0-000000000a36] 11762 1726853302.26172: sending task result for task 02083763-bbaf-d845-03d0-000000000a36 11762 1726853302.26262: done sending task result for task 02083763-bbaf-d845-03d0-000000000a36 11762 1726853302.26265: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11762 1726853302.26316: no more pending results, returning what we have 11762 1726853302.26320: results queue empty 11762 1726853302.26321: checking for any_errors_fatal 11762 1726853302.26327: done checking for any_errors_fatal 11762 1726853302.26327: checking for max_fail_percentage 11762 1726853302.26329: done checking for max_fail_percentage 11762 1726853302.26330: checking to see if all hosts have failed and the running result is not ok 11762 1726853302.26331: done checking to see if all hosts have failed 11762 1726853302.26331: getting the remaining hosts for this loop 11762 1726853302.26333: done getting the remaining hosts for this loop 11762 1726853302.26336: getting the next task for host managed_node2 11762 1726853302.26344: done getting next task for host managed_node2 11762 1726853302.26348: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853302.26353: ^ state is: HOST 
STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853302.26374: getting variables 11762 1726853302.26376: in VariableManager get_vars() 11762 1726853302.26419: Calling all_inventory to load vars for managed_node2 11762 1726853302.26421: Calling groups_inventory to load vars for managed_node2 11762 1726853302.26423: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853302.26432: Calling all_plugins_play to load vars for managed_node2 11762 1726853302.26435: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853302.26437: Calling groups_plugins_play to load vars for managed_node2 11762 1726853302.27230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853302.28092: done with get_vars() 11762 1726853302.28110: done getting variables 11762 1726853302.28152: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:22 -0400 (0:00:00.130) 0:00:52.712 ****** 11762 1726853302.28179: entering _queue_task() for managed_node2/package 11762 1726853302.28404: worker is 1 (out of 1 available) 11762 1726853302.28418: exiting _queue_task() for managed_node2/package 11762 1726853302.28431: done queuing things up, now waiting for results queue to drain 11762 1726853302.28433: waiting for pending results... 
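
The "Install packages" skip recorded above carries enough detail to sketch the task that produced it: the `package` action plugin is loaded, and the false condition is logged verbatim as `not network_packages is subset(ansible_facts.packages.keys())`. This is a hedged reconstruction, not the role's actual source; the module arguments are assumptions.

```yaml
# Hedged reconstruction of the skipped task -- the task name, the
# package action plugin, and the when-condition all appear verbatim
# in the log; the module arguments are assumptions.
- name: Install packages
  package:
    name: "{{ network_packages }}"  # assumed: the same list tested in the condition
    state: present
  when:
    # Run only if some requested package is missing from the package
    # facts, i.e. the requested set is NOT a subset of the installed set.
    - not network_packages is subset(ansible_facts.packages.keys())
```

On this run the condition evaluated to False (every requested package was already present in `ansible_facts.packages`), so the executor reported `skip_reason: Conditional result was False` without ever dispatching the module to the host.
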
11762 1726853302.28615: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853302.28716: in run() - task 02083763-bbaf-d845-03d0-000000000a37 11762 1726853302.28727: variable 'ansible_search_path' from source: unknown 11762 1726853302.28731: variable 'ansible_search_path' from source: unknown 11762 1726853302.28763: calling self._execute() 11762 1726853302.28835: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.28838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.28850: variable 'omit' from source: magic vars 11762 1726853302.29129: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.29138: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853302.29225: variable 'network_state' from source: role '' defaults 11762 1726853302.29233: Evaluated conditional (network_state != {}): False 11762 1726853302.29236: when evaluation is False, skipping this task 11762 1726853302.29239: _execute() done 11762 1726853302.29242: dumping result to json 11762 1726853302.29247: done dumping result, returning 11762 1726853302.29255: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-d845-03d0-000000000a37] 11762 1726853302.29260: sending task result for task 02083763-bbaf-d845-03d0-000000000a37 11762 1726853302.29353: done sending task result for task 02083763-bbaf-d845-03d0-000000000a37 11762 1726853302.29356: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853302.29405: no more pending results, returning what we have 11762 1726853302.29410: results queue empty 11762 1726853302.29410: checking 
for any_errors_fatal 11762 1726853302.29415: done checking for any_errors_fatal 11762 1726853302.29415: checking for max_fail_percentage 11762 1726853302.29417: done checking for max_fail_percentage 11762 1726853302.29418: checking to see if all hosts have failed and the running result is not ok 11762 1726853302.29419: done checking to see if all hosts have failed 11762 1726853302.29420: getting the remaining hosts for this loop 11762 1726853302.29422: done getting the remaining hosts for this loop 11762 1726853302.29425: getting the next task for host managed_node2 11762 1726853302.29432: done getting next task for host managed_node2 11762 1726853302.29436: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853302.29441: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853302.29459: getting variables 11762 1726853302.29460: in VariableManager get_vars() 11762 1726853302.29501: Calling all_inventory to load vars for managed_node2 11762 1726853302.29503: Calling groups_inventory to load vars for managed_node2 11762 1726853302.29505: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853302.29513: Calling all_plugins_play to load vars for managed_node2 11762 1726853302.29516: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853302.29518: Calling groups_plugins_play to load vars for managed_node2 11762 1726853302.30381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853302.31224: done with get_vars() 11762 1726853302.31239: done getting variables 11762 1726853302.31280: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:22 -0400 (0:00:00.031) 0:00:52.743 ****** 11762 1726853302.31304: entering _queue_task() for managed_node2/package 11762 1726853302.31516: worker is 1 (out of 1 available) 11762 1726853302.31532: exiting _queue_task() for managed_node2/package 11762 1726853302.31547: done queuing things up, now waiting for results queue to drain 11762 1726853302.31549: waiting for pending results... 
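
The skip just logged for "Install NetworkManager and nmstate when using network_state variable" follows a different guard: `network_state != {}`, with `network_state` resolved from the role defaults. A hedged sketch of what that task plausibly looks like — the task name, the `package` action plugin, and the condition come from the log; the package list is an assumption inferred from the task name.

```yaml
# Hedged sketch -- only the task name, the package action plugin, and
# the when-condition appear in the log; the package list is assumed.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}  # role default is an empty dict, so this skips
```

This play appears to drive the role through `network_connections` (resolved repeatedly from task vars above) rather than `network_state`, so the default empty dict makes the condition False and the task is skipped — as is the analogous `python3-libnmstate` install task evaluated next.
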
11762 1726853302.31860: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853302.31966: in run() - task 02083763-bbaf-d845-03d0-000000000a38 11762 1726853302.31981: variable 'ansible_search_path' from source: unknown 11762 1726853302.31985: variable 'ansible_search_path' from source: unknown 11762 1726853302.32013: calling self._execute() 11762 1726853302.32093: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.32096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.32105: variable 'omit' from source: magic vars 11762 1726853302.32376: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.32387: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853302.32467: variable 'network_state' from source: role '' defaults 11762 1726853302.32476: Evaluated conditional (network_state != {}): False 11762 1726853302.32479: when evaluation is False, skipping this task 11762 1726853302.32482: _execute() done 11762 1726853302.32484: dumping result to json 11762 1726853302.32487: done dumping result, returning 11762 1726853302.32495: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-d845-03d0-000000000a38] 11762 1726853302.32501: sending task result for task 02083763-bbaf-d845-03d0-000000000a38 11762 1726853302.32589: done sending task result for task 02083763-bbaf-d845-03d0-000000000a38 11762 1726853302.32592: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853302.32638: no more pending results, returning what we have 11762 1726853302.32642: results queue empty 11762 1726853302.32643: checking for 
any_errors_fatal 11762 1726853302.32652: done checking for any_errors_fatal 11762 1726853302.32653: checking for max_fail_percentage 11762 1726853302.32655: done checking for max_fail_percentage 11762 1726853302.32656: checking to see if all hosts have failed and the running result is not ok 11762 1726853302.32656: done checking to see if all hosts have failed 11762 1726853302.32657: getting the remaining hosts for this loop 11762 1726853302.32659: done getting the remaining hosts for this loop 11762 1726853302.32661: getting the next task for host managed_node2 11762 1726853302.32669: done getting next task for host managed_node2 11762 1726853302.32673: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853302.32678: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853302.32695: getting variables 11762 1726853302.32696: in VariableManager get_vars() 11762 1726853302.32731: Calling all_inventory to load vars for managed_node2 11762 1726853302.32733: Calling groups_inventory to load vars for managed_node2 11762 1726853302.32735: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853302.32743: Calling all_plugins_play to load vars for managed_node2 11762 1726853302.32745: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853302.32748: Calling groups_plugins_play to load vars for managed_node2 11762 1726853302.33503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853302.35022: done with get_vars() 11762 1726853302.35042: done getting variables 11762 1726853302.35100: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:22 -0400 (0:00:00.038) 0:00:52.781 ****** 11762 1726853302.35134: entering _queue_task() for managed_node2/service 11762 1726853302.35393: worker is 1 (out of 1 available) 11762 1726853302.35407: exiting _queue_task() for managed_node2/service 11762 1726853302.35418: done queuing things up, now waiting for results queue to drain 11762 1726853302.35421: waiting for pending results... 
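
The "Restart NetworkManager due to wireless or team interfaces" task queued above is guarded by `__network_wireless_connections_defined or __network_team_connections_defined`, two role defaults that are rendered against `network_connections` (which is why the log re-resolves every profile variable while evaluating them). A hedged sketch of the service task behind it — the `service` action plugin and the condition are named in the log; the service arguments are assumptions.

```yaml
# Hedged sketch -- only the task name, the service action plugin, and
# the when-condition appear in the log; name/state are assumptions.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since none of the controller/port profiles resolved above define wireless or team interfaces, both defaults render False and the restart is skipped, as the evaluation further down records.
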
11762 1726853302.35798: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853302.35841: in run() - task 02083763-bbaf-d845-03d0-000000000a39 11762 1726853302.35894: variable 'ansible_search_path' from source: unknown 11762 1726853302.35897: variable 'ansible_search_path' from source: unknown 11762 1726853302.35910: calling self._execute() 11762 1726853302.36008: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.36018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.36031: variable 'omit' from source: magic vars 11762 1726853302.36392: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.36434: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853302.36529: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853302.36726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853302.38650: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853302.38693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853302.38722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853302.38748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853302.38767: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853302.38828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11762 1726853302.38851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.38869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.38896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.38906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.38941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.38960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.38978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.39003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.39014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.39045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.39063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.39081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.39104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.39114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.39224: variable 'network_connections' from source: task vars 11762 1726853302.39235: variable 'controller_profile' from source: play vars 11762 1726853302.39285: variable 'controller_profile' from source: play vars 11762 1726853302.39295: variable 'controller_device' from source: play vars 11762 1726853302.39576: variable 'controller_device' from source: play vars 11762 1726853302.39579: variable 'dhcp_interface1' from source: play vars 11762 1726853302.39581: variable 'dhcp_interface1' from source: play vars 11762 1726853302.39583: variable 'port1_profile' from source: play vars 11762 1726853302.39585: variable 'port1_profile' from source: play vars 11762 1726853302.39587: variable 'dhcp_interface1' from source: play vars 11762 
1726853302.39589: variable 'dhcp_interface1' from source: play vars 11762 1726853302.39591: variable 'controller_profile' from source: play vars 11762 1726853302.39651: variable 'controller_profile' from source: play vars 11762 1726853302.39663: variable 'port2_profile' from source: play vars 11762 1726853302.39737: variable 'port2_profile' from source: play vars 11762 1726853302.39753: variable 'dhcp_interface2' from source: play vars 11762 1726853302.39825: variable 'dhcp_interface2' from source: play vars 11762 1726853302.39836: variable 'controller_profile' from source: play vars 11762 1726853302.39900: variable 'controller_profile' from source: play vars 11762 1726853302.39975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853302.40138: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853302.40185: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853302.40217: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853302.40253: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853302.40307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853302.40331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853302.40362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.40433: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853302.40514: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853302.40750: variable 'network_connections' from source: task vars 11762 1726853302.40762: variable 'controller_profile' from source: play vars 11762 1726853302.40832: variable 'controller_profile' from source: play vars 11762 1726853302.40846: variable 'controller_device' from source: play vars 11762 1726853302.41076: variable 'controller_device' from source: play vars 11762 1726853302.41079: variable 'dhcp_interface1' from source: play vars 11762 1726853302.41081: variable 'dhcp_interface1' from source: play vars 11762 1726853302.41083: variable 'port1_profile' from source: play vars 11762 1726853302.41086: variable 'port1_profile' from source: play vars 11762 1726853302.41088: variable 'dhcp_interface1' from source: play vars 11762 1726853302.41121: variable 'dhcp_interface1' from source: play vars 11762 1726853302.41132: variable 'controller_profile' from source: play vars 11762 1726853302.41197: variable 'controller_profile' from source: play vars 11762 1726853302.41209: variable 'port2_profile' from source: play vars 11762 1726853302.41272: variable 'port2_profile' from source: play vars 11762 1726853302.41284: variable 'dhcp_interface2' from source: play vars 11762 1726853302.41346: variable 'dhcp_interface2' from source: play vars 11762 1726853302.41358: variable 'controller_profile' from source: play vars 11762 1726853302.41421: variable 'controller_profile' from source: play vars 11762 1726853302.41460: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853302.41469: when evaluation is False, skipping this task 11762 1726853302.41478: _execute() done 11762 1726853302.41485: dumping result to json 11762 
1726853302.41492: done dumping result, returning 11762 1726853302.41501: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000a39] 11762 1726853302.41510: sending task result for task 02083763-bbaf-d845-03d0-000000000a39 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853302.41654: no more pending results, returning what we have 11762 1726853302.41658: results queue empty 11762 1726853302.41658: checking for any_errors_fatal 11762 1726853302.41664: done checking for any_errors_fatal 11762 1726853302.41664: checking for max_fail_percentage 11762 1726853302.41666: done checking for max_fail_percentage 11762 1726853302.41667: checking to see if all hosts have failed and the running result is not ok 11762 1726853302.41668: done checking to see if all hosts have failed 11762 1726853302.41668: getting the remaining hosts for this loop 11762 1726853302.41670: done getting the remaining hosts for this loop 11762 1726853302.41675: getting the next task for host managed_node2 11762 1726853302.41687: done getting next task for host managed_node2 11762 1726853302.41690: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853302.41695: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853302.41723: getting variables 11762 1726853302.41724: in VariableManager get_vars() 11762 1726853302.41764: Calling all_inventory to load vars for managed_node2 11762 1726853302.41766: Calling groups_inventory to load vars for managed_node2 11762 1726853302.41768: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853302.41986: Calling all_plugins_play to load vars for managed_node2 11762 1726853302.41990: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853302.42041: done sending task result for task 02083763-bbaf-d845-03d0-000000000a39 11762 1726853302.42047: WORKER PROCESS EXITING 11762 1726853302.42052: Calling groups_plugins_play to load vars for managed_node2 11762 1726853302.43864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853302.45616: done with get_vars() 11762 1726853302.45670: done getting variables 11762 1726853302.45729: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:28:22 -0400 (0:00:00.109) 0:00:52.891 ****** 11762 1726853302.46135: entering _queue_task() for managed_node2/service 11762 1726853302.46861: worker is 1 (out of 1 available) 11762 1726853302.47277: exiting _queue_task() for managed_node2/service 11762 1726853302.47289: done queuing things up, now waiting for results queue to drain 11762 1726853302.47290: waiting for pending results... 11762 1726853302.47312: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853302.47318: in run() - task 02083763-bbaf-d845-03d0-000000000a3a 11762 1726853302.47321: variable 'ansible_search_path' from source: unknown 11762 1726853302.47324: variable 'ansible_search_path' from source: unknown 11762 1726853302.47326: calling self._execute() 11762 1726853302.47385: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.47396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.47409: variable 'omit' from source: magic vars 11762 1726853302.47767: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.47785: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853302.47948: variable 'network_provider' from source: set_fact 11762 1726853302.48176: variable 'network_state' from source: role '' defaults 11762 1726853302.48179: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11762 1726853302.48181: variable 'omit' from source: magic vars 11762 1726853302.48184: variable 
'omit' from source: magic vars 11762 1726853302.48186: variable 'network_service_name' from source: role '' defaults 11762 1726853302.48188: variable 'network_service_name' from source: role '' defaults 11762 1726853302.48254: variable '__network_provider_setup' from source: role '' defaults 11762 1726853302.48265: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853302.48330: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853302.48345: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853302.48408: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853302.48623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853302.50650: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853302.50715: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853302.50755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853302.50808: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853302.50836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853302.50918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.50952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.50985: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.51028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.51047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.51095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.51121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.51147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.51191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.51208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.51427: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853302.51537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.51564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.51594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.51634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.51652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.51744: variable 'ansible_python' from source: facts 11762 1726853302.51765: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853302.51845: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853302.51925: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853302.52047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.52078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.52105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.52145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.52163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.52212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853302.52249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853302.52279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.52377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853302.52381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853302.52469: variable 'network_connections' from source: task vars 11762 1726853302.52485: variable 'controller_profile' from source: play vars 11762 1726853302.52557: variable 'controller_profile' from source: play vars 11762 1726853302.52578: variable 'controller_device' from source: play vars 
11762 1726853302.52655: variable 'controller_device' from source: play vars 11762 1726853302.52676: variable 'dhcp_interface1' from source: play vars 11762 1726853302.52746: variable 'dhcp_interface1' from source: play vars 11762 1726853302.52764: variable 'port1_profile' from source: play vars 11762 1726853302.52837: variable 'port1_profile' from source: play vars 11762 1726853302.52976: variable 'dhcp_interface1' from source: play vars 11762 1726853302.52979: variable 'dhcp_interface1' from source: play vars 11762 1726853302.52981: variable 'controller_profile' from source: play vars 11762 1726853302.53010: variable 'controller_profile' from source: play vars 11762 1726853302.53026: variable 'port2_profile' from source: play vars 11762 1726853302.53103: variable 'port2_profile' from source: play vars 11762 1726853302.53119: variable 'dhcp_interface2' from source: play vars 11762 1726853302.53193: variable 'dhcp_interface2' from source: play vars 11762 1726853302.53208: variable 'controller_profile' from source: play vars 11762 1726853302.53282: variable 'controller_profile' from source: play vars 11762 1726853302.53385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853302.53559: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853302.53613: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853302.53663: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853302.53709: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853302.53774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853302.53803: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853302.53834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853302.53866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853302.53923: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853302.54578: variable 'network_connections' from source: task vars 11762 1726853302.54582: variable 'controller_profile' from source: play vars 11762 1726853302.54585: variable 'controller_profile' from source: play vars 11762 1726853302.54587: variable 'controller_device' from source: play vars 11762 1726853302.54589: variable 'controller_device' from source: play vars 11762 1726853302.54591: variable 'dhcp_interface1' from source: play vars 11762 1726853302.54797: variable 'dhcp_interface1' from source: play vars 11762 1726853302.54815: variable 'port1_profile' from source: play vars 11762 1726853302.55177: variable 'port1_profile' from source: play vars 11762 1726853302.55180: variable 'dhcp_interface1' from source: play vars 11762 1726853302.55183: variable 'dhcp_interface1' from source: play vars 11762 1726853302.55193: variable 'controller_profile' from source: play vars 11762 1726853302.55264: variable 'controller_profile' from source: play vars 11762 1726853302.55477: variable 'port2_profile' from source: play vars 11762 1726853302.55480: variable 'port2_profile' from source: play vars 11762 1726853302.55483: variable 'dhcp_interface2' from source: play vars 11762 1726853302.55638: variable 'dhcp_interface2' from source: play vars 11762 
1726853302.55655: variable 'controller_profile' from source: play vars 11762 1726853302.55732: variable 'controller_profile' from source: play vars 11762 1726853302.56025: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853302.56108: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853302.56604: variable 'network_connections' from source: task vars 11762 1726853302.56877: variable 'controller_profile' from source: play vars 11762 1726853302.56880: variable 'controller_profile' from source: play vars 11762 1726853302.56883: variable 'controller_device' from source: play vars 11762 1726853302.56934: variable 'controller_device' from source: play vars 11762 1726853302.56987: variable 'dhcp_interface1' from source: play vars 11762 1726853302.57066: variable 'dhcp_interface1' from source: play vars 11762 1726853302.57097: variable 'port1_profile' from source: play vars 11762 1726853302.57166: variable 'port1_profile' from source: play vars 11762 1726853302.57182: variable 'dhcp_interface1' from source: play vars 11762 1726853302.57249: variable 'dhcp_interface1' from source: play vars 11762 1726853302.57261: variable 'controller_profile' from source: play vars 11762 1726853302.57331: variable 'controller_profile' from source: play vars 11762 1726853302.57341: variable 'port2_profile' from source: play vars 11762 1726853302.57407: variable 'port2_profile' from source: play vars 11762 1726853302.57420: variable 'dhcp_interface2' from source: play vars 11762 1726853302.57494: variable 'dhcp_interface2' from source: play vars 11762 1726853302.57506: variable 'controller_profile' from source: play vars 11762 1726853302.57579: variable 'controller_profile' from source: play vars 11762 1726853302.57611: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853302.57693: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853302.57944: 
variable 'network_connections' from source: task vars 11762 1726853302.57954: variable 'controller_profile' from source: play vars 11762 1726853302.58026: variable 'controller_profile' from source: play vars 11762 1726853302.58037: variable 'controller_device' from source: play vars 11762 1726853302.58108: variable 'controller_device' from source: play vars 11762 1726853302.58121: variable 'dhcp_interface1' from source: play vars 11762 1726853302.58201: variable 'dhcp_interface1' from source: play vars 11762 1726853302.58214: variable 'port1_profile' from source: play vars 11762 1726853302.58286: variable 'port1_profile' from source: play vars 11762 1726853302.58297: variable 'dhcp_interface1' from source: play vars 11762 1726853302.58364: variable 'dhcp_interface1' from source: play vars 11762 1726853302.58383: variable 'controller_profile' from source: play vars 11762 1726853302.58450: variable 'controller_profile' from source: play vars 11762 1726853302.58468: variable 'port2_profile' from source: play vars 11762 1726853302.58539: variable 'port2_profile' from source: play vars 11762 1726853302.58551: variable 'dhcp_interface2' from source: play vars 11762 1726853302.58624: variable 'dhcp_interface2' from source: play vars 11762 1726853302.58637: variable 'controller_profile' from source: play vars 11762 1726853302.58707: variable 'controller_profile' from source: play vars 11762 1726853302.58770: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853302.58833: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853302.58975: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853302.58978: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853302.59117: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853302.59581: variable 'network_connections' from 
source: task vars 11762 1726853302.59591: variable 'controller_profile' from source: play vars 11762 1726853302.59651: variable 'controller_profile' from source: play vars 11762 1726853302.59662: variable 'controller_device' from source: play vars 11762 1726853302.59724: variable 'controller_device' from source: play vars 11762 1726853302.59736: variable 'dhcp_interface1' from source: play vars 11762 1726853302.59800: variable 'dhcp_interface1' from source: play vars 11762 1726853302.59812: variable 'port1_profile' from source: play vars 11762 1726853302.59875: variable 'port1_profile' from source: play vars 11762 1726853302.59887: variable 'dhcp_interface1' from source: play vars 11762 1726853302.59945: variable 'dhcp_interface1' from source: play vars 11762 1726853302.59957: variable 'controller_profile' from source: play vars 11762 1726853302.60019: variable 'controller_profile' from source: play vars 11762 1726853302.60031: variable 'port2_profile' from source: play vars 11762 1726853302.60091: variable 'port2_profile' from source: play vars 11762 1726853302.60103: variable 'dhcp_interface2' from source: play vars 11762 1726853302.60160: variable 'dhcp_interface2' from source: play vars 11762 1726853302.60173: variable 'controller_profile' from source: play vars 11762 1726853302.60233: variable 'controller_profile' from source: play vars 11762 1726853302.60247: variable 'ansible_distribution' from source: facts 11762 1726853302.60257: variable '__network_rh_distros' from source: role '' defaults 11762 1726853302.60269: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.60298: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853302.60658: variable 'ansible_distribution' from source: facts 11762 1726853302.60666: variable '__network_rh_distros' from source: role '' defaults 11762 1726853302.60776: variable 'ansible_distribution_major_version' from source: facts 11762 
1726853302.60779: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853302.60861: variable 'ansible_distribution' from source: facts 11762 1726853302.60870: variable '__network_rh_distros' from source: role '' defaults 11762 1726853302.60881: variable 'ansible_distribution_major_version' from source: facts 11762 1726853302.60919: variable 'network_provider' from source: set_fact 11762 1726853302.60946: variable 'omit' from source: magic vars 11762 1726853302.60978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853302.61009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853302.61032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853302.61052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853302.61066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853302.61098: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853302.61106: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.61113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.61207: Set connection var ansible_timeout to 10 11762 1726853302.61215: Set connection var ansible_shell_type to sh 11762 1726853302.61224: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853302.61233: Set connection var ansible_shell_executable to /bin/sh 11762 1726853302.61245: Set connection var ansible_pipelining to False 11762 1726853302.61256: Set connection var ansible_connection to ssh 11762 1726853302.61284: variable 'ansible_shell_executable' from source: unknown 11762 
1726853302.61291: variable 'ansible_connection' from source: unknown 11762 1726853302.61375: variable 'ansible_module_compression' from source: unknown 11762 1726853302.61378: variable 'ansible_shell_type' from source: unknown 11762 1726853302.61380: variable 'ansible_shell_executable' from source: unknown 11762 1726853302.61382: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853302.61384: variable 'ansible_pipelining' from source: unknown 11762 1726853302.61386: variable 'ansible_timeout' from source: unknown 11762 1726853302.61388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853302.61429: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853302.61444: variable 'omit' from source: magic vars 11762 1726853302.61453: starting attempt loop 11762 1726853302.61460: running the handler 11762 1726853302.61538: variable 'ansible_facts' from source: unknown 11762 1726853302.62259: _low_level_execute_command(): starting 11762 1726853302.62273: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853302.62943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853302.62959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853302.63073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853302.63102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853302.63210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853302.64965: stdout chunk (state=3): >>>/root <<< 11762 1726853302.65156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853302.65204: stderr chunk (state=3): >>><<< 11762 1726853302.65509: stdout chunk (state=3): >>><<< 11762 1726853302.65513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853302.65522: _low_level_execute_command(): starting 11762 1726853302.65525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434 `" && echo ansible-tmp-1726853302.654124-14286-160040573127434="` echo /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434 `" ) && sleep 0' 11762 1726853302.66832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853302.67053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853302.67057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853302.67059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 
1726853302.67094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853302.67196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853302.69239: stdout chunk (state=3): >>>ansible-tmp-1726853302.654124-14286-160040573127434=/root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434 <<< 11762 1726853302.69439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853302.69448: stdout chunk (state=3): >>><<< 11762 1726853302.69455: stderr chunk (state=3): >>><<< 11762 1726853302.69476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853302.654124-14286-160040573127434=/root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853302.69516: 
variable 'ansible_module_compression' from source: unknown 11762 1726853302.69576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11762 1726853302.69980: variable 'ansible_facts' from source: unknown 11762 1726853302.70532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/AnsiballZ_systemd.py 11762 1726853302.70921: Sending initial data 11762 1726853302.70924: Sent initial data (155 bytes) 11762 1726853302.71988: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853302.72086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853302.72096: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853302.72143: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853302.72261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853302.72274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853302.72285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 
1726853302.72402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853302.74259: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853302.74275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853302.74459: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpxcpw4e9b /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/AnsiballZ_systemd.py <<< 11762 1726853302.74462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/AnsiballZ_systemd.py" <<< 11762 1726853302.74575: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpxcpw4e9b" to remote "/root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/AnsiballZ_systemd.py" <<< 11762 1726853302.76652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853302.76876: stderr chunk (state=3): >>><<< 11762 1726853302.76880: stdout chunk (state=3): >>><<< 11762 1726853302.76882: done transferring module to remote 11762 1726853302.76884: _low_level_execute_command(): starting 11762 1726853302.76887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/ /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/AnsiballZ_systemd.py && sleep 0' 11762 1726853302.77385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853302.77389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853302.77486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853302.77535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853302.77539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853302.77569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853302.77657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853302.79621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853302.79624: stdout chunk (state=3): >>><<< 11762 1726853302.79630: stderr chunk (state=3): >>><<< 11762 1726853302.79692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853302.79699: _low_level_execute_command(): starting 11762 1726853302.79747: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/AnsiballZ_systemd.py && sleep 0' 11762 1726853302.80277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853302.80285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853302.80296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853302.80376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853302.80383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853302.80386: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853302.80388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853302.80421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853302.80444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853302.80458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853302.80482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853302.80586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853303.10577: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4521984", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310399488", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "717389000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", 
"StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 11762 1726853303.10609: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": 
"no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11762 1726853303.12818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853303.12823: stdout chunk (state=3): >>><<< 11762 1726853303.12826: stderr chunk (state=3): >>><<< 11762 1726853303.12831: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4521984", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310399488", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "717389000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
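For orientation while reading the log: the `module_args` recorded in the `_execute_module` call below correspond to a task roughly like the following. This is a reconstruction from the logged arguments (`name: NetworkManager`, `state: started`, `enabled: true`) and the task name shown later in the log, not the actual playbook source, which lives in the `fedora.linux_system_roles.network` role:

```yaml
# Hypothetical reconstruction of the task whose execution is logged here.
# The role invokes it via ansible.legacy.systemd; the equivalent standalone
# task would use the systemd module with the same arguments.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
```

Because the unit was already active and enabled, the module reports `"changed": false`, and since the role sets `no_log: true` for this task, the full result JSON seen in the raw stdout above is censored in the task summary that follows.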
11762 1726853303.13693: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853303.13699: _low_level_execute_command(): starting 11762 1726853303.13705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853302.654124-14286-160040573127434/ > /dev/null 2>&1 && sleep 0' 11762 1726853303.14904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853303.15014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853303.15031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853303.15058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853303.15212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853303.17193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853303.17231: stderr chunk (state=3): >>><<< 11762 1726853303.17399: stdout chunk (state=3): >>><<< 11762 1726853303.17403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853303.17405: handler run complete 11762 
1726853303.17575: attempt loop complete, returning result
11762 1726853303.17579: _execute() done
11762 1726853303.17581: dumping result to json
11762 1726853303.17583: done dumping result, returning
11762 1726853303.17585: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-d845-03d0-000000000a3a]
11762 1726853303.17587: sending task result for task 02083763-bbaf-d845-03d0-000000000a3a
ok: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
11762 1726853303.17985: no more pending results, returning what we have
11762 1726853303.17989: results queue empty
11762 1726853303.17990: checking for any_errors_fatal
11762 1726853303.17997: done checking for any_errors_fatal
11762 1726853303.17998: checking for max_fail_percentage
11762 1726853303.18000: done checking for max_fail_percentage
11762 1726853303.18001: checking to see if all hosts have failed and the running result is not ok
11762 1726853303.18002: done checking to see if all hosts have failed
11762 1726853303.18002: getting the remaining hosts for this loop
11762 1726853303.18005: done getting the remaining hosts for this loop
11762 1726853303.18008: getting the next task for host managed_node2
11762 1726853303.18015: done getting next task for host managed_node2
11762 1726853303.18018: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
11762 1726853303.18022: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853303.18033: getting variables
11762 1726853303.18035: in VariableManager get_vars()
11762 1726853303.18113: Calling all_inventory to load vars for managed_node2
11762 1726853303.18116: Calling groups_inventory to load vars for managed_node2
11762 1726853303.18118: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853303.18128: Calling all_plugins_play to load vars for managed_node2
11762 1726853303.18131: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853303.18134: Calling groups_plugins_play to load vars for managed_node2
11762 1726853303.19385: done sending task result for task 02083763-bbaf-d845-03d0-000000000a3a
11762 1726853303.19388: WORKER PROCESS EXITING
11762 1726853303.21494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853303.23968: done with get_vars()
11762 1726853303.24014: done getting variables
11762 1726853303.24088: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 13:28:23 -0400 (0:00:00.780) 0:00:53.671 ******
11762 1726853303.24143: entering _queue_task() for managed_node2/service
11762 1726853303.24702: worker is 1 (out of 1 available)
11762 1726853303.24715: exiting _queue_task() for managed_node2/service
11762 1726853303.24727: done queuing things up, now waiting for results queue to drain
11762 1726853303.24729: waiting for pending results...
11762 1726853303.24858: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
11762 1726853303.25519: in run() - task 02083763-bbaf-d845-03d0-000000000a3b
11762 1726853303.25606: variable 'ansible_search_path' from source: unknown
11762 1726853303.25614: variable 'ansible_search_path' from source: unknown
11762 1726853303.25655: calling self._execute()
11762 1726853303.25917: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853303.25930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853303.25943: variable 'omit' from source: magic vars
11762 1726853303.26730: variable 'ansible_distribution_major_version' from source: facts
11762 1726853303.26748: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853303.26999: variable 'network_provider' from source: set_fact
11762 1726853303.27195: Evaluated conditional (network_provider == "nm"): True
11762 1726853303.27199: variable '__network_wpa_supplicant_required' from source: role '' defaults
11762 1726853303.27373: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
11762 1726853303.27791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11762 1726853303.32023: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11762 1726853303.32152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11762 1726853303.32195: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11762 1726853303.32274: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11762 1726853303.32375: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11762 1726853303.32586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11762 1726853303.32622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11762 1726853303.32702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853303.32867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11762 1726853303.32870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11762 1726853303.32912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11762 1726853303.33023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11762 1726853303.33054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853303.33111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11762 1726853303.33130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11762 1726853303.33175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11762 1726853303.33204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11762 1726853303.33242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853303.33285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11762 1726853303.33304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11762 1726853303.33543: variable 'network_connections' from source: task vars
11762 1726853303.33546: variable 'controller_profile' from source: play vars
11762 1726853303.33590: variable 'controller_profile' from source: play vars
11762 1726853303.33607: variable 'controller_device' from source: play vars
11762 1726853303.33668: variable 'controller_device' from source: play vars
11762 1726853303.33683: variable 'dhcp_interface1' from source: play vars
11762 1726853303.33775: variable 'dhcp_interface1' from source: play vars
11762 1726853303.33789: variable 'port1_profile' from source: play vars
11762 1726853303.33877: variable 'port1_profile' from source: play vars
11762 1726853303.33887: variable 'dhcp_interface1' from source: play vars
11762 1726853303.33940: variable 'dhcp_interface1' from source: play vars
11762 1726853303.33950: variable 'controller_profile' from source: play vars
11762 1726853303.34197: variable 'controller_profile' from source: play vars
11762 1726853303.34201: variable 'port2_profile' from source: play vars
11762 1726853303.34204: variable 'port2_profile' from source: play vars
11762 1726853303.34206: variable 'dhcp_interface2' from source: play vars
11762 1726853303.34378: variable 'dhcp_interface2' from source: play vars
11762 1726853303.34382: variable 'controller_profile' from source: play vars
11762 1726853303.34443: variable 'controller_profile' from source: play vars
11762 1726853303.34676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11762 1726853303.35118: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11762 1726853303.35222: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11762 1726853303.35327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11762 1726853303.35361: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11762 1726853303.35489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11762 1726853303.35521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11762 1726853303.35562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853303.35647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11762 1726853303.35838: variable '__network_wireless_connections_defined' from source: role '' defaults
11762 1726853303.36306: variable 'network_connections' from source: task vars
11762 1726853303.36319: variable 'controller_profile' from source: play vars
11762 1726853303.36383: variable 'controller_profile' from source: play vars
11762 1726853303.36575: variable 'controller_device' from source: play vars
11762 1726853303.36578: variable 'controller_device' from source: play vars
11762 1726853303.36580: variable 'dhcp_interface1' from source: play vars
11762 1726853303.36878: variable 'dhcp_interface1' from source: play vars
11762 1726853303.36882: variable 'port1_profile' from source: play vars
11762 1726853303.36884: variable 'port1_profile' from source: play vars
11762 1726853303.36886: variable 'dhcp_interface1' from source: play vars
11762 1726853303.37002: variable 'dhcp_interface1' from source: play vars
11762 1726853303.37015: variable 'controller_profile' from source: play vars
11762 1726853303.37076: variable 'controller_profile' from source: play vars
11762 1726853303.37185: variable 'port2_profile' from source: play vars
11762 1726853303.37245: variable 'port2_profile' from source: play vars
11762 1726853303.37319: variable 'dhcp_interface2' from source: play vars
11762 1726853303.37422: variable 'dhcp_interface2' from source: play vars
11762 1726853303.37433: variable 'controller_profile' from source: play vars
11762 1726853303.37490: variable 'controller_profile' from source: play vars
11762 1726853303.37614: Evaluated conditional (__network_wpa_supplicant_required): False
11762 1726853303.37639: when evaluation is False, skipping this task
11762 1726853303.37852: _execute() done
11762 1726853303.37856: dumping result to json
11762 1726853303.37858: done dumping result, returning
11762 1726853303.37860: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-d845-03d0-000000000a3b]
11762 1726853303.37862: sending task result for task 02083763-bbaf-d845-03d0-000000000a3b
11762 1726853303.37937: done sending task result for task 02083763-bbaf-d845-03d0-000000000a3b
11762 1726853303.37940: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
11762 1726853303.38004: no more pending results, returning what we have
11762 1726853303.38009: results queue empty
11762 1726853303.38010: checking for any_errors_fatal
11762 1726853303.38030: done checking for any_errors_fatal
11762 1726853303.38031: checking for max_fail_percentage
11762 1726853303.38034: done checking for max_fail_percentage
11762 1726853303.38035: checking to see if all hosts have failed and the running result is not ok
11762 1726853303.38035: done checking to see if all hosts have failed
11762 1726853303.38036: getting the remaining hosts for this loop
11762 1726853303.38038: done getting the remaining hosts for this loop
11762 1726853303.38042: getting the next task for host managed_node2
11762 1726853303.38049: done getting next task for host managed_node2
11762 1726853303.38053: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
11762 1726853303.38059: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853303.38082: getting variables
11762 1726853303.38085: in VariableManager get_vars()
11762 1726853303.38130: Calling all_inventory to load vars for managed_node2
11762 1726853303.38133: Calling groups_inventory to load vars for managed_node2
11762 1726853303.38136: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853303.38147: Calling all_plugins_play to load vars for managed_node2
11762 1726853303.38150: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853303.38153: Calling groups_plugins_play to load vars for managed_node2
11762 1726853303.41033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853303.43604: done with get_vars()
11762 1726853303.43630: done getting variables
11762 1726853303.43703: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 13:28:23 -0400 (0:00:00.195) 0:00:53.867 ******
11762 1726853303.43740: entering _queue_task() for managed_node2/service
11762 1726853303.44102: worker is 1 (out of 1 available)
11762 1726853303.44114: exiting _queue_task() for managed_node2/service
11762 1726853303.44128: done queuing things up, now waiting for results queue to drain
11762 1726853303.44130: waiting for pending results...
11762 1726853303.44438: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service
11762 1726853303.44625: in run() - task 02083763-bbaf-d845-03d0-000000000a3c
11762 1726853303.44646: variable 'ansible_search_path' from source: unknown
11762 1726853303.44654: variable 'ansible_search_path' from source: unknown
11762 1726853303.44702: calling self._execute()
11762 1726853303.44806: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853303.44905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853303.44908: variable 'omit' from source: magic vars
11762 1726853303.45527: variable 'ansible_distribution_major_version' from source: facts
11762 1726853303.45545: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853303.45779: variable 'network_provider' from source: set_fact
11762 1726853303.45792: Evaluated conditional (network_provider == "initscripts"): False
11762 1726853303.46079: when evaluation is False, skipping this task
11762 1726853303.46082: _execute() done
11762 1726853303.46086: dumping result to json
11762 1726853303.46089: done dumping result, returning
11762 1726853303.46091: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-d845-03d0-000000000a3c]
11762 1726853303.46093: sending task result for task 02083763-bbaf-d845-03d0-000000000a3c
11762 1726853303.46167: done sending task result for task 02083763-bbaf-d845-03d0-000000000a3c
11762 1726853303.46172: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
11762 1726853303.46417: no more pending results, returning what we have
11762 1726853303.46421: results queue empty
11762 1726853303.46422: checking for any_errors_fatal
11762 1726853303.46430: done checking for any_errors_fatal
11762 1726853303.46431: checking for max_fail_percentage
11762 1726853303.46433: done checking for max_fail_percentage
11762 1726853303.46434: checking to see if all hosts have failed and the running result is not ok
11762 1726853303.46435: done checking to see if all hosts have failed
11762 1726853303.46436: getting the remaining hosts for this loop
11762 1726853303.46438: done getting the remaining hosts for this loop
11762 1726853303.46441: getting the next task for host managed_node2
11762 1726853303.46449: done getting next task for host managed_node2
11762 1726853303.46453: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
11762 1726853303.46458: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853303.46485: getting variables
11762 1726853303.46487: in VariableManager get_vars()
11762 1726853303.46533: Calling all_inventory to load vars for managed_node2
11762 1726853303.46536: Calling groups_inventory to load vars for managed_node2
11762 1726853303.46539: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853303.46549: Calling all_plugins_play to load vars for managed_node2
11762 1726853303.46552: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853303.46555: Calling groups_plugins_play to load vars for managed_node2
11762 1726853303.49524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853303.52461: done with get_vars()
11762 1726853303.52701: done getting variables
11762 1726853303.52762: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 13:28:23 -0400 (0:00:00.092) 0:00:53.960 ******
11762 1726853303.53007: entering _queue_task() for managed_node2/copy
11762 1726853303.53447: worker is 1 (out of 1 available)
11762 1726853303.53462: exiting _queue_task() for managed_node2/copy
11762 1726853303.53590: done queuing things up, now waiting for results queue to drain
11762 1726853303.53593: waiting for pending results...
11762 1726853303.53798: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
11762 1726853303.54015: in run() - task 02083763-bbaf-d845-03d0-000000000a3d
11762 1726853303.54044: variable 'ansible_search_path' from source: unknown
11762 1726853303.54058: variable 'ansible_search_path' from source: unknown
11762 1726853303.54134: calling self._execute()
11762 1726853303.54256: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853303.54259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853303.54262: variable 'omit' from source: magic vars
11762 1726853303.54675: variable 'ansible_distribution_major_version' from source: facts
11762 1726853303.54699: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853303.54976: variable 'network_provider' from source: set_fact
11762 1726853303.54980: Evaluated conditional (network_provider == "initscripts"): False
11762 1726853303.54983: when evaluation is False, skipping this task
11762 1726853303.54986: _execute() done
11762 1726853303.54988: dumping result to json
11762 1726853303.54990: done dumping result, returning
11762 1726853303.54994: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-d845-03d0-000000000a3d]
11762 1726853303.54996: sending task result for task 02083763-bbaf-d845-03d0-000000000a3d
11762 1726853303.55075: done sending task result for task 02083763-bbaf-d845-03d0-000000000a3d
11762 1726853303.55078: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
11762 1726853303.55133: no more pending results, returning what we have
11762 1726853303.55138: results queue empty
11762 1726853303.55140: checking for any_errors_fatal
11762 1726853303.55145: done checking for any_errors_fatal
11762 1726853303.55146: checking for max_fail_percentage
11762 1726853303.55149: done checking for max_fail_percentage
11762 1726853303.55150: checking to see if all hosts have failed and the running result is not ok
11762 1726853303.55151: done checking to see if all hosts have failed
11762 1726853303.55151: getting the remaining hosts for this loop
11762 1726853303.55154: done getting the remaining hosts for this loop
11762 1726853303.55157: getting the next task for host managed_node2
11762 1726853303.55167: done getting next task for host managed_node2
11762 1726853303.55173: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
11762 1726853303.55179: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853303.55207: getting variables
11762 1726853303.55209: in VariableManager get_vars()
11762 1726853303.55255: Calling all_inventory to load vars for managed_node2
11762 1726853303.55258: Calling groups_inventory to load vars for managed_node2
11762 1726853303.55261: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853303.55361: Calling all_plugins_play to load vars for managed_node2
11762 1726853303.55365: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853303.55369: Calling groups_plugins_play to load vars for managed_node2
11762 1726853303.57006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853303.59848: done with get_vars()
11762 1726853303.59877: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 13:28:23 -0400 (0:00:00.069) 0:00:54.030 ******
11762 1726853303.59966: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
11762 1726853303.60711: worker is 1 (out of 1 available)
11762 1726853303.60725: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
11762 1726853303.60738: done queuing things up, now waiting for results queue to drain
11762 1726853303.60739: waiting for pending results...
11762 1726853303.61394: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
11762 1726853303.61601: in run() - task 02083763-bbaf-d845-03d0-000000000a3e
11762 1726853303.61618: variable 'ansible_search_path' from source: unknown
11762 1726853303.61652: variable 'ansible_search_path' from source: unknown
11762 1726853303.61732: calling self._execute()
11762 1726853303.61923: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853303.61927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853303.61930: variable 'omit' from source: magic vars
11762 1726853303.62234: variable 'ansible_distribution_major_version' from source: facts
11762 1726853303.62248: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853303.62254: variable 'omit' from source: magic vars
11762 1726853303.62329: variable 'omit' from source: magic vars
11762 1726853303.62502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11762 1726853303.66139: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11762 1726853303.66222: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11762 1726853303.66260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11762 1726853303.66293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11762 1726853303.66317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11762 1726853303.66403: variable 'network_provider' from source: set_fact
11762 1726853303.66543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11762 1726853303.66574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11762 1726853303.66599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853303.66638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11762 1726853303.66660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11762 1726853303.66731: variable 'omit' from source: magic vars
11762 1726853303.66840: variable 'omit' from source: magic vars
11762 1726853303.66943: variable 'network_connections' from source: task vars
11762 1726853303.66961: variable 'controller_profile' from source: play vars
11762 1726853303.67028: variable 'controller_profile' from source: play vars
11762 1726853303.67038: variable 'controller_device' from source: play vars
11762 1726853303.67111: variable 'controller_device' from source: play vars
11762 1726853303.67116: variable 'dhcp_interface1' from source: play vars
11762 1726853303.67219: variable 'dhcp_interface1' from source: play vars
11762 1726853303.67223: variable 'port1_profile' from source: play vars
11762 1726853303.67250: variable 'port1_profile' from source: play vars
11762 1726853303.67257: variable 'dhcp_interface1' from source: play vars
11762 1726853303.67319: variable 'dhcp_interface1' from source: play vars
11762 1726853303.67328: variable 'controller_profile' from source: play vars
11762 1726853303.67385: variable 'controller_profile' from source: play vars
11762 1726853303.67438: variable 'port2_profile' from source: play vars
11762 1726853303.67455: variable 'port2_profile' from source: play vars
11762 1726853303.67462: variable 'dhcp_interface2' from source: play vars
11762 1726853303.67561: variable 'dhcp_interface2' from source: play vars
11762 1726853303.67567: variable 'controller_profile' from source: play vars
11762 1726853303.67824: variable 'controller_profile' from source: play vars
11762 1726853303.68230: variable 'omit' from source: magic vars
11762 1726853303.68238: variable '__lsr_ansible_managed' from source: task vars
11762 1726853303.68411: variable '__lsr_ansible_managed' from source: task vars
11762 1726853303.68740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
11762 1726853303.69108: Loaded config def from plugin (lookup/template)
11762 1726853303.69228: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
11762 1726853303.69277: File lookup term: get_ansible_managed.j2
11762 1726853303.69280: variable 'ansible_search_path' from source: unknown
11762 1726853303.69284: evaluation_path:
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
11762 1726853303.69333: search_path:
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
11762 1726853303.69386: variable 'ansible_search_path' from source: unknown
11762 1726853303.91893: variable 'ansible_managed' from source: unknown
11762 1726853303.92032: variable 'omit' from source: magic vars
11762 1726853303.92063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11762 1726853303.92195: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11762 1726853303.92198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11762 1726853303.92201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853303.92203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11762 1726853303.92205: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11762 1726853303.92207: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853303.92209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853303.92268: Set connection var ansible_timeout to 10
11762 1726853303.92279: Set connection var ansible_shell_type to sh
11762 1726853303.92289: Set connection var ansible_module_compression to ZIP_DEFLATED
11762 1726853303.92311: Set connection var ansible_shell_executable to /bin/sh
11762 1726853303.92322: Set connection var ansible_pipelining to False
11762 1726853303.92332: Set connection var ansible_connection to ssh
11762 1726853303.92357: variable 'ansible_shell_executable' from source: unknown
11762 1726853303.92364: variable 'ansible_connection' from source: unknown
11762 1726853303.92370: variable 'ansible_module_compression' from source: unknown
11762 1726853303.92379: variable 'ansible_shell_type' from source: unknown
11762 1726853303.92386: variable 'ansible_shell_executable' from source: unknown
11762 1726853303.92392: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853303.92399: variable 'ansible_pipelining' from source: unknown
11762 1726853303.92408: variable 'ansible_timeout' from source: unknown
11762 1726853303.92418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853303.92542: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
11762 1726853303.92557: variable 'omit' from source: magic vars
11762 1726853303.92567: starting attempt loop
11762 1726853303.92632: running the handler
11762 1726853303.92635: _low_level_execute_command(): starting
11762 1726853303.92637: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11762 1726853303.93297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853303.93317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853303.93390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853303.93410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853303.93521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853303.93620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853303.95362: stdout chunk (state=3): >>>/root <<< 11762 1726853303.95514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853303.95524: stdout chunk (state=3): >>><<< 11762 1726853303.95536: stderr chunk (state=3): >>><<< 11762 1726853303.95587: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853303.95606: _low_level_execute_command(): starting 11762 1726853303.95854: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180 `" && echo ansible-tmp-1726853303.9559386-14332-62095419927180="` echo /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180 `" ) && sleep 0' 11762 1726853303.96962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853303.97081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853303.97172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853303.97340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853303.97497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853303.99529: stdout chunk (state=3): >>>ansible-tmp-1726853303.9559386-14332-62095419927180=/root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180 <<< 11762 1726853303.99640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853303.99681: stderr chunk (state=3): >>><<< 11762 1726853303.99894: stdout chunk (state=3): >>><<< 11762 1726853303.99897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853303.9559386-14332-62095419927180=/root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853303.99904: variable 'ansible_module_compression' from source: unknown 11762 1726853304.00076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11762 1726853304.00079: variable 'ansible_facts' from source: unknown 11762 1726853304.00330: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/AnsiballZ_network_connections.py 11762 1726853304.00561: Sending initial data 11762 1726853304.00570: Sent initial data (167 bytes) 11762 1726853304.01631: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853304.01787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853304.01845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853304.01864: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 11762 1726853304.01894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853304.02085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853304.03792: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853304.03844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853304.03935: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpotifm4z9 /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/AnsiballZ_network_connections.py <<< 11762 1726853304.03939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/AnsiballZ_network_connections.py" <<< 11762 1726853304.04035: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpotifm4z9" to remote "/root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/AnsiballZ_network_connections.py" <<< 11762 1726853304.06001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853304.06094: stderr chunk (state=3): >>><<< 11762 1726853304.06100: stdout chunk (state=3): >>><<< 11762 1726853304.06167: done transferring module to remote 11762 1726853304.06209: _low_level_execute_command(): starting 11762 1726853304.06212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/ /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/AnsiballZ_network_connections.py && sleep 0' 11762 1726853304.06789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853304.06799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853304.06876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853304.06879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853304.06884: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853304.06887: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853304.06889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853304.06891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853304.06893: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853304.06895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853304.06992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853304.06995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853304.06998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853304.07030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853304.07125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853304.09094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853304.09098: stdout chunk (state=3): >>><<< 11762 1726853304.09100: stderr chunk (state=3): >>><<< 11762 1726853304.09145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853304.09172: _low_level_execute_command(): starting 11762 1726853304.09175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/AnsiballZ_network_connections.py && sleep 0' 11762 1726853304.09844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853304.09848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853304.09851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853304.09854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853304.09856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853304.09858: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853304.09861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853304.09863: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853304.09865: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853304.09867: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853304.09868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853304.09870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853304.09874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853304.09876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853304.09878: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853304.09880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853304.09933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853304.09954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853304.09975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853304.10079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853304.57379: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified)\n[011] #1, state:up 
persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11762 1726853304.59526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853304.59545: stdout chunk (state=3): >>><<< 11762 1726853304.59577: stderr chunk (state=3): >>><<< 11762 1726853304.59603: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", 
"interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853304.59693: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'arp_interval': 60, 'arp_ip_target': '192.0.2.128', 'arp_validate': 'none', 'primary': 'test1'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853304.59714: _low_level_execute_command(): starting 11762 1726853304.59729: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853303.9559386-14332-62095419927180/ > /dev/null 2>&1 && sleep 0' 11762 1726853304.60442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853304.60458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853304.60550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853304.60565: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853304.60593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853304.60617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853304.60635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853304.60811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853304.62776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853304.62780: stdout chunk (state=3): >>><<< 11762 1726853304.62782: stderr chunk (state=3): >>><<< 11762 1726853304.62785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853304.62787: handler run complete 11762 1726853304.62792: attempt loop complete, returning result 11762 1726853304.62794: _execute() done 11762 1726853304.62796: dumping result to json 11762 1726853304.62805: done dumping result, returning 11762 1726853304.62813: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-d845-03d0-000000000a3e] 11762 1726853304.62816: sending task result for task 02083763-bbaf-d845-03d0-000000000a3e changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
61017df5-d8c8-402f-9503-fd0fc150036f [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active) 11762 1726853304.63191: no more pending results, returning what we have 11762 1726853304.63194: results queue empty 11762 1726853304.63195: checking for any_errors_fatal 11762 1726853304.63202: done checking for any_errors_fatal 11762 1726853304.63203: checking for max_fail_percentage 11762 1726853304.63204: done checking for max_fail_percentage 11762 1726853304.63205: checking to see if all hosts have failed and the running result is not ok 11762 1726853304.63206: done checking to see if all hosts have failed 11762 1726853304.63207: getting the remaining hosts for this loop 11762 1726853304.63208: done getting the remaining hosts for this loop 11762 1726853304.63212: getting the next task for host managed_node2 11762 1726853304.63218: done getting next task for host managed_node2 11762 1726853304.63225: done sending task result for task 02083763-bbaf-d845-03d0-000000000a3e 11762 1726853304.63370: WORKER PROCESS EXITING 11762 1726853304.63363: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11762 1726853304.63379: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853304.63392: getting variables 11762 1726853304.63394: in VariableManager get_vars() 11762 1726853304.63435: Calling all_inventory to load vars for managed_node2 11762 1726853304.63437: Calling groups_inventory to load vars for managed_node2 11762 1726853304.63439: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853304.63451: Calling all_plugins_play to load vars for managed_node2 11762 1726853304.63454: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853304.63457: Calling groups_plugins_play to load vars for managed_node2 11762 1726853304.65450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853304.67041: done with get_vars() 11762 1726853304.67066: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:28:24 -0400 
(0:00:01.071) 0:00:55.101 ****** 11762 1726853304.67155: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11762 1726853304.67490: worker is 1 (out of 1 available) 11762 1726853304.67503: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11762 1726853304.67515: done queuing things up, now waiting for results queue to drain 11762 1726853304.67517: waiting for pending results... 11762 1726853304.67811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11762 1726853304.67973: in run() - task 02083763-bbaf-d845-03d0-000000000a3f 11762 1726853304.67997: variable 'ansible_search_path' from source: unknown 11762 1726853304.68004: variable 'ansible_search_path' from source: unknown 11762 1726853304.68044: calling self._execute() 11762 1726853304.68150: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853304.68211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853304.68215: variable 'omit' from source: magic vars 11762 1726853304.68565: variable 'ansible_distribution_major_version' from source: facts 11762 1726853304.68586: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853304.68722: variable 'network_state' from source: role '' defaults 11762 1726853304.68739: Evaluated conditional (network_state != {}): False 11762 1726853304.68753: when evaluation is False, skipping this task 11762 1726853304.68764: _execute() done 11762 1726853304.68974: dumping result to json 11762 1726853304.68978: done dumping result, returning 11762 1726853304.68981: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-d845-03d0-000000000a3f] 11762 1726853304.68984: sending task result for task 02083763-bbaf-d845-03d0-000000000a3f 11762 1726853304.69064: done sending task result for task 
02083763-bbaf-d845-03d0-000000000a3f 11762 1726853304.69068: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853304.69131: no more pending results, returning what we have 11762 1726853304.69136: results queue empty 11762 1726853304.69137: checking for any_errors_fatal 11762 1726853304.69155: done checking for any_errors_fatal 11762 1726853304.69157: checking for max_fail_percentage 11762 1726853304.69159: done checking for max_fail_percentage 11762 1726853304.69160: checking to see if all hosts have failed and the running result is not ok 11762 1726853304.69161: done checking to see if all hosts have failed 11762 1726853304.69161: getting the remaining hosts for this loop 11762 1726853304.69163: done getting the remaining hosts for this loop 11762 1726853304.69167: getting the next task for host managed_node2 11762 1726853304.69177: done getting next task for host managed_node2 11762 1726853304.69181: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11762 1726853304.69187: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853304.69212: getting variables 11762 1726853304.69214: in VariableManager get_vars() 11762 1726853304.69264: Calling all_inventory to load vars for managed_node2 11762 1726853304.69267: Calling groups_inventory to load vars for managed_node2 11762 1726853304.69270: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853304.69456: Calling all_plugins_play to load vars for managed_node2 11762 1726853304.69459: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853304.69462: Calling groups_plugins_play to load vars for managed_node2 11762 1726853304.70746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853304.77474: done with get_vars() 11762 1726853304.77498: done getting variables 11762 1726853304.77548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:28:24 -0400 (0:00:00.106) 0:00:55.208 ****** 11762 1726853304.77785: entering _queue_task() for managed_node2/debug 11762 1726853304.78496: worker is 1 (out of 1 available) 11762 1726853304.78509: exiting _queue_task() for managed_node2/debug 11762 1726853304.78524: done queuing things up, now waiting for results 
queue to drain 11762 1726853304.78526: waiting for pending results... 11762 1726853304.78950: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11762 1726853304.79140: in run() - task 02083763-bbaf-d845-03d0-000000000a40 11762 1726853304.79177: variable 'ansible_search_path' from source: unknown 11762 1726853304.79182: variable 'ansible_search_path' from source: unknown 11762 1726853304.79376: calling self._execute() 11762 1726853304.79380: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853304.79383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853304.79386: variable 'omit' from source: magic vars 11762 1726853304.79729: variable 'ansible_distribution_major_version' from source: facts 11762 1726853304.79749: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853304.79761: variable 'omit' from source: magic vars 11762 1726853304.79845: variable 'omit' from source: magic vars 11762 1726853304.79886: variable 'omit' from source: magic vars 11762 1726853304.79932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853304.79979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853304.80007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853304.80030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853304.80054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853304.80091: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853304.80100: variable 'ansible_host' from source: host vars for 
'managed_node2' 11762 1726853304.80109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853304.80223: Set connection var ansible_timeout to 10 11762 1726853304.80232: Set connection var ansible_shell_type to sh 11762 1726853304.80267: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853304.80272: Set connection var ansible_shell_executable to /bin/sh 11762 1726853304.80275: Set connection var ansible_pipelining to False 11762 1726853304.80279: Set connection var ansible_connection to ssh 11762 1726853304.80305: variable 'ansible_shell_executable' from source: unknown 11762 1726853304.80313: variable 'ansible_connection' from source: unknown 11762 1726853304.80377: variable 'ansible_module_compression' from source: unknown 11762 1726853304.80380: variable 'ansible_shell_type' from source: unknown 11762 1726853304.80383: variable 'ansible_shell_executable' from source: unknown 11762 1726853304.80385: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853304.80388: variable 'ansible_pipelining' from source: unknown 11762 1726853304.80391: variable 'ansible_timeout' from source: unknown 11762 1726853304.80394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853304.80506: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853304.80525: variable 'omit' from source: magic vars 11762 1726853304.80535: starting attempt loop 11762 1726853304.80541: running the handler 11762 1726853304.80676: variable '__network_connections_result' from source: set_fact 11762 1726853304.81176: handler run complete 11762 1726853304.81180: attempt loop complete, returning result 11762 1726853304.81182: _execute() 
done 11762 1726853304.81185: dumping result to json 11762 1726853304.81187: done dumping result, returning 11762 1726853304.81190: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-d845-03d0-000000000a40] 11762 1726853304.81192: sending task result for task 02083763-bbaf-d845-03d0-000000000a40 11762 1726853304.81255: done sending task result for task 02083763-bbaf-d845-03d0-000000000a40 11762 1726853304.81259: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active)" ] } 11762 1726853304.81340: no more pending results, returning what we have 11762 1726853304.81347: results queue empty 11762 1726853304.81348: checking for any_errors_fatal 11762 1726853304.81356: done checking for any_errors_fatal 11762 1726853304.81357: checking for max_fail_percentage 11762 1726853304.81358: done checking for max_fail_percentage 11762 1726853304.81360: checking to see if all hosts have failed and the running result is not ok 11762 1726853304.81360: done checking to see if all hosts have failed 11762 1726853304.81361: getting the remaining hosts for this loop 11762 1726853304.81364: done getting the remaining 
hosts for this loop 11762 1726853304.81366: getting the next task for host managed_node2 11762 1726853304.81376: done getting next task for host managed_node2 11762 1726853304.81379: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11762 1726853304.81383: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853304.81394: getting variables 11762 1726853304.81396: in VariableManager get_vars() 11762 1726853304.81435: Calling all_inventory to load vars for managed_node2 11762 1726853304.81438: Calling groups_inventory to load vars for managed_node2 11762 1726853304.81441: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853304.81454: Calling all_plugins_play to load vars for managed_node2 11762 1726853304.81464: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853304.81467: Calling groups_plugins_play to load vars for managed_node2 11762 1726853304.84645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853304.88405: done with get_vars() 11762 1726853304.88438: done getting variables 11762 1726853304.88505: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:28:24 -0400 (0:00:00.107) 0:00:55.315 ****** 11762 1726853304.88548: entering _queue_task() for managed_node2/debug 11762 1726853304.89312: worker is 1 (out of 1 available) 11762 1726853304.89328: exiting _queue_task() for managed_node2/debug 11762 1726853304.89345: done queuing things up, now waiting for results queue to drain 11762 1726853304.89347: waiting for pending results... 
11762 1726853304.90214: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11762 1726853304.90259: in run() - task 02083763-bbaf-d845-03d0-000000000a41 11762 1726853304.90279: variable 'ansible_search_path' from source: unknown 11762 1726853304.90283: variable 'ansible_search_path' from source: unknown 11762 1726853304.90321: calling self._execute() 11762 1726853304.90653: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853304.90656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853304.90659: variable 'omit' from source: magic vars 11762 1726853304.91043: variable 'ansible_distribution_major_version' from source: facts 11762 1726853304.91059: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853304.91074: variable 'omit' from source: magic vars 11762 1726853304.91277: variable 'omit' from source: magic vars 11762 1726853304.91282: variable 'omit' from source: magic vars 11762 1726853304.91286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853304.91289: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853304.91291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853304.91314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853304.91319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853304.91358: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853304.91361: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853304.91364: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11762 1726853304.91469: Set connection var ansible_timeout to 10 11762 1726853304.91474: Set connection var ansible_shell_type to sh 11762 1726853304.91480: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853304.91485: Set connection var ansible_shell_executable to /bin/sh 11762 1726853304.91494: Set connection var ansible_pipelining to False 11762 1726853304.91503: Set connection var ansible_connection to ssh 11762 1726853304.91523: variable 'ansible_shell_executable' from source: unknown 11762 1726853304.91526: variable 'ansible_connection' from source: unknown 11762 1726853304.91529: variable 'ansible_module_compression' from source: unknown 11762 1726853304.91532: variable 'ansible_shell_type' from source: unknown 11762 1726853304.91534: variable 'ansible_shell_executable' from source: unknown 11762 1726853304.91536: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853304.91539: variable 'ansible_pipelining' from source: unknown 11762 1726853304.91541: variable 'ansible_timeout' from source: unknown 11762 1726853304.91556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853304.91779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853304.91783: variable 'omit' from source: magic vars 11762 1726853304.91786: starting attempt loop 11762 1726853304.91788: running the handler 11762 1726853304.91790: variable '__network_connections_result' from source: set_fact 11762 1726853304.91844: variable '__network_connections_result' from source: set_fact 11762 1726853304.92172: handler run complete 11762 1726853304.92175: attempt loop complete, returning result 11762 1726853304.92177: 
_execute() done 11762 1726853304.92179: dumping result to json 11762 1726853304.92180: done dumping result, returning 11762 1726853304.92209: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-d845-03d0-000000000a41] 11762 1726853304.92212: sending task result for task 02083763-bbaf-d845-03d0-000000000a41 11762 1726853304.92285: done sending task result for task 02083763-bbaf-d845-03d0-000000000a41 11762 1726853304.92288: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up 
connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active)" ] } } 11762 1726853304.92413: no more pending results, returning what we have 11762 1726853304.92418: results queue empty 11762 1726853304.92419: checking for any_errors_fatal 11762 1726853304.92425: done checking for any_errors_fatal 11762 1726853304.92426: checking for max_fail_percentage 11762 1726853304.92428: done checking for max_fail_percentage 11762 1726853304.92429: checking to see if all hosts have failed and the running result is not ok 11762 1726853304.92430: done checking to see if all hosts have failed 11762 1726853304.92431: getting the remaining hosts for this loop 11762 1726853304.92433: done getting the remaining hosts for this loop 11762 1726853304.92436: getting the next task for host managed_node2 11762 1726853304.92446: done getting next task for host managed_node2 11762 1726853304.92450: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11762 1726853304.92456: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853304.92469: getting variables 11762 1726853304.92473: in VariableManager get_vars() 11762 1726853304.92517: Calling all_inventory to load vars for managed_node2 11762 1726853304.92520: Calling groups_inventory to load vars for managed_node2 11762 1726853304.92523: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853304.92533: Calling all_plugins_play to load vars for managed_node2 11762 1726853304.92537: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853304.92540: Calling groups_plugins_play to load vars for managed_node2 11762 1726853304.94899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853304.97018: done with get_vars() 11762 1726853304.97045: done getting variables 11762 1726853304.97104: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:28:24 -0400 (0:00:00.085) 0:00:55.401 ****** 11762 1726853304.97140: entering _queue_task() for managed_node2/debug 11762 1726853304.97686: worker is 1 (out of 1 available) 11762 1726853304.97700: exiting _queue_task() for managed_node2/debug 11762 1726853304.97876: done queuing things up, now waiting for results queue to drain 11762 1726853304.97878: waiting for pending results... 11762 1726853304.98127: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11762 1726853304.98213: in run() - task 02083763-bbaf-d845-03d0-000000000a42 11762 1726853304.98234: variable 'ansible_search_path' from source: unknown 11762 1726853304.98238: variable 'ansible_search_path' from source: unknown 11762 1726853304.98279: calling self._execute() 11762 1726853304.98390: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853304.98403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853304.98408: variable 'omit' from source: magic vars 11762 1726853304.98813: variable 'ansible_distribution_major_version' from source: facts 11762 1726853304.98824: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853304.98948: variable 'network_state' from source: role '' defaults 11762 1726853304.98985: Evaluated conditional (network_state != {}): False 11762 1726853304.98988: when evaluation is False, skipping this task 11762 1726853304.98991: _execute() done 11762 1726853304.98994: dumping result to json 11762 1726853304.98996: done 
dumping result, returning 11762 1726853304.98998: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-d845-03d0-000000000a42] 11762 1726853304.99000: sending task result for task 02083763-bbaf-d845-03d0-000000000a42 11762 1726853304.99210: done sending task result for task 02083763-bbaf-d845-03d0-000000000a42 11762 1726853304.99214: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11762 1726853304.99261: no more pending results, returning what we have 11762 1726853304.99265: results queue empty 11762 1726853304.99266: checking for any_errors_fatal 11762 1726853304.99274: done checking for any_errors_fatal 11762 1726853304.99275: checking for max_fail_percentage 11762 1726853304.99277: done checking for max_fail_percentage 11762 1726853304.99278: checking to see if all hosts have failed and the running result is not ok 11762 1726853304.99278: done checking to see if all hosts have failed 11762 1726853304.99279: getting the remaining hosts for this loop 11762 1726853304.99281: done getting the remaining hosts for this loop 11762 1726853304.99284: getting the next task for host managed_node2 11762 1726853304.99291: done getting next task for host managed_node2 11762 1726853304.99295: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11762 1726853304.99300: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853304.99318: getting variables 11762 1726853304.99320: in VariableManager get_vars() 11762 1726853304.99362: Calling all_inventory to load vars for managed_node2 11762 1726853304.99365: Calling groups_inventory to load vars for managed_node2 11762 1726853304.99368: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853304.99496: Calling all_plugins_play to load vars for managed_node2 11762 1726853304.99500: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853304.99503: Calling groups_plugins_play to load vars for managed_node2 11762 1726853305.02383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853305.05640: done with get_vars() 11762 1726853305.05877: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:28:25 -0400 (0:00:00.088) 0:00:55.490 ****** 11762 1726853305.05984: entering _queue_task() for managed_node2/ping 11762 1726853305.06537: worker is 1 (out of 1 available) 11762 1726853305.06552: exiting _queue_task() for managed_node2/ping 11762 1726853305.06566: done queuing things up, now waiting for results queue to drain 11762 1726853305.06568: waiting for pending 
results... 11762 1726853305.07058: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11762 1726853305.07532: in run() - task 02083763-bbaf-d845-03d0-000000000a43 11762 1726853305.07538: variable 'ansible_search_path' from source: unknown 11762 1726853305.07542: variable 'ansible_search_path' from source: unknown 11762 1726853305.07573: calling self._execute() 11762 1726853305.07737: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.07966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.07970: variable 'omit' from source: magic vars 11762 1726853305.08590: variable 'ansible_distribution_major_version' from source: facts 11762 1726853305.08736: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853305.08749: variable 'omit' from source: magic vars 11762 1726853305.08879: variable 'omit' from source: magic vars 11762 1726853305.08918: variable 'omit' from source: magic vars 11762 1726853305.09184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853305.09228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853305.09478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853305.09481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853305.09484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853305.09488: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853305.09490: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.09493: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853305.09496: Set connection var ansible_timeout to 10 11762 1726853305.09499: Set connection var ansible_shell_type to sh 11762 1726853305.09501: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853305.09504: Set connection var ansible_shell_executable to /bin/sh 11762 1726853305.09506: Set connection var ansible_pipelining to False 11762 1726853305.09509: Set connection var ansible_connection to ssh 11762 1726853305.09511: variable 'ansible_shell_executable' from source: unknown 11762 1726853305.09513: variable 'ansible_connection' from source: unknown 11762 1726853305.09516: variable 'ansible_module_compression' from source: unknown 11762 1726853305.09523: variable 'ansible_shell_type' from source: unknown 11762 1726853305.09531: variable 'ansible_shell_executable' from source: unknown 11762 1726853305.09537: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.09544: variable 'ansible_pipelining' from source: unknown 11762 1726853305.09551: variable 'ansible_timeout' from source: unknown 11762 1726853305.09558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.09776: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853305.09798: variable 'omit' from source: magic vars 11762 1726853305.09812: starting attempt loop 11762 1726853305.09818: running the handler 11762 1726853305.09836: _low_level_execute_command(): starting 11762 1726853305.09919: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853305.10698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.10718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853305.10747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853305.10762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.10919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.12662: stdout chunk (state=3): >>>/root <<< 11762 1726853305.12863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.12867: stdout chunk (state=3): >>><<< 11762 1726853305.12869: stderr chunk (state=3): >>><<< 11762 1726853305.12894: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853305.13077: _low_level_execute_command(): starting 11762 1726853305.13082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106 `" && echo ansible-tmp-1726853305.1297803-14393-164359025595106="` echo /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106 `" ) && sleep 0' 11762 1726853305.14210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853305.14224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.14238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.14420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853305.14502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.14584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.16694: stdout chunk (state=3): >>>ansible-tmp-1726853305.1297803-14393-164359025595106=/root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106 <<< 11762 1726853305.16712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.16723: stderr chunk (state=3): >>><<< 11762 1726853305.16731: stdout chunk (state=3): >>><<< 11762 1726853305.16986: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853305.1297803-14393-164359025595106=/root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853305.16989: variable 'ansible_module_compression' from source: unknown 11762 1726853305.17176: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11762 1726853305.17180: variable 'ansible_facts' from source: unknown 11762 1726853305.17430: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/AnsiballZ_ping.py 11762 1726853305.17796: Sending initial data 11762 1726853305.17799: Sent initial data (153 bytes) 11762 1726853305.19159: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853305.19163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853305.19220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853305.19227: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.19444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.19486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.21185: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853305.21252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853305.21341: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp3amcz2vp /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/AnsiballZ_ping.py <<< 11762 1726853305.21344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/AnsiballZ_ping.py" <<< 11762 1726853305.21498: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp3amcz2vp" to remote "/root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/AnsiballZ_ping.py" <<< 11762 1726853305.22805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.22829: stderr chunk (state=3): >>><<< 11762 1726853305.22832: stdout chunk (state=3): >>><<< 11762 1726853305.22879: done transferring module to remote 11762 1726853305.22882: _low_level_execute_command(): starting 11762 1726853305.22884: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/ /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/AnsiballZ_ping.py && sleep 0' 11762 1726853305.23868: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853305.23873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853305.24086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.24149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.26212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.26287: stderr chunk (state=3): >>><<< 11762 1726853305.26291: stdout chunk (state=3): >>><<< 11762 1726853305.26308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853305.26318: _low_level_execute_command(): starting 11762 1726853305.26320: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/AnsiballZ_ping.py && sleep 0' 11762 1726853305.27797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853305.27813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853305.27829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853305.27882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853305.27895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.28076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853305.28091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.28233: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11762 1726853305.43906: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}<<< 11762 1726853305.43961: stdout chunk (state=3): >>> <<< 11762 1726853305.45337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853305.45376: stderr chunk (state=3): >>><<< 11762 1726853305.45379: stdout chunk (state=3): >>><<< 11762 1726853305.45399: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
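The exchange above ends with the remote `AnsiballZ_ping.py` module printing a single JSON object on stdout (`{"ping": "pong", ...}`), which the controller then parses into the task result shown a few lines later (`ok: [managed_node2] => {... "ping": "pong"}`). A minimal sketch of that parsing step, assuming a hypothetical helper `parse_module_output` (this is an illustration of the pattern, not Ansible's internal API):

```python
import json

# Hypothetical helper mirroring the pattern visible in the log above:
# the remote module emits exactly one JSON object on stdout, and the
# controller turns rc + stdout into the task result dict.
def parse_module_output(rc: int, stdout: str) -> dict:
    """Parse a module's JSON stdout into a result dict."""
    result = json.loads(stdout.strip() or "{}")
    if rc != 0 and "failed" not in result:
        # A non-zero return code implies failure unless the module
        # already set the "failed" key itself.
        result["failed"] = True
    return result

# The ping module above exited with rc=0 and this payload:
out = '{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}'
print(parse_module_output(0, out)["ping"])  # pong
```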
11762 1726853305.45436: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853305.45494: _low_level_execute_command(): starting 11762 1726853305.45497: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853305.1297803-14393-164359025595106/ > /dev/null 2>&1 && sleep 0' 11762 1726853305.46160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853305.46164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853305.46167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.46170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853305.46175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.46177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853305.46179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853305.46218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.46296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.48177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.48194: stderr chunk (state=3): >>><<< 11762 1726853305.48197: stdout chunk (state=3): >>><<< 11762 1726853305.48212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 11762 1726853305.48221: handler run complete 11762 1726853305.48233: attempt loop complete, returning result 11762 1726853305.48236: _execute() done 11762 1726853305.48238: dumping result to json 11762 1726853305.48240: done dumping result, returning 11762 1726853305.48249: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-d845-03d0-000000000a43] 11762 1726853305.48255: sending task result for task 02083763-bbaf-d845-03d0-000000000a43 11762 1726853305.48341: done sending task result for task 02083763-bbaf-d845-03d0-000000000a43 11762 1726853305.48347: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11762 1726853305.48417: no more pending results, returning what we have 11762 1726853305.48421: results queue empty 11762 1726853305.48422: checking for any_errors_fatal 11762 1726853305.48428: done checking for any_errors_fatal 11762 1726853305.48428: checking for max_fail_percentage 11762 1726853305.48430: done checking for max_fail_percentage 11762 1726853305.48431: checking to see if all hosts have failed and the running result is not ok 11762 1726853305.48432: done checking to see if all hosts have failed 11762 1726853305.48432: getting the remaining hosts for this loop 11762 1726853305.48434: done getting the remaining hosts for this loop 11762 1726853305.48437: getting the next task for host managed_node2 11762 1726853305.48450: done getting next task for host managed_node2 11762 1726853305.48452: ^ task is: TASK: meta (role_complete) 11762 1726853305.48457: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853305.48469: getting variables 11762 1726853305.48472: in VariableManager get_vars() 11762 1726853305.48524: Calling all_inventory to load vars for managed_node2 11762 1726853305.48527: Calling groups_inventory to load vars for managed_node2 11762 1726853305.48529: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853305.48538: Calling all_plugins_play to load vars for managed_node2 11762 1726853305.48540: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853305.48542: Calling groups_plugins_play to load vars for managed_node2 11762 1726853305.50081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853305.50952: done with get_vars() 11762 1726853305.50969: done getting variables 11762 1726853305.51030: done queuing things up, now waiting for results queue to drain 11762 1726853305.51031: results queue empty 11762 1726853305.51032: checking for any_errors_fatal 11762 1726853305.51033: done checking for 
any_errors_fatal 11762 1726853305.51034: checking for max_fail_percentage 11762 1726853305.51034: done checking for max_fail_percentage 11762 1726853305.51035: checking to see if all hosts have failed and the running result is not ok 11762 1726853305.51035: done checking to see if all hosts have failed 11762 1726853305.51036: getting the remaining hosts for this loop 11762 1726853305.51036: done getting the remaining hosts for this loop 11762 1726853305.51038: getting the next task for host managed_node2 11762 1726853305.51041: done getting next task for host managed_node2 11762 1726853305.51043: ^ task is: TASK: Show result 11762 1726853305.51045: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853305.51047: getting variables 11762 1726853305.51047: in VariableManager get_vars() 11762 1726853305.51057: Calling all_inventory to load vars for managed_node2 11762 1726853305.51058: Calling groups_inventory to load vars for managed_node2 11762 1726853305.51059: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853305.51063: Calling all_plugins_play to load vars for managed_node2 11762 1726853305.51065: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853305.51067: Calling groups_plugins_play to load vars for managed_node2 11762 1726853305.51756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853305.53146: done with get_vars() 11762 1726853305.53175: done getting variables 11762 1726853305.53219: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33 Friday 20 September 2024 13:28:25 -0400 (0:00:00.472) 0:00:55.962 ****** 11762 1726853305.53249: entering _queue_task() for managed_node2/debug 11762 1726853305.53606: worker is 1 (out of 1 available) 11762 1726853305.53620: exiting _queue_task() for managed_node2/debug 11762 1726853305.53635: done queuing things up, now waiting for results queue to drain 11762 1726853305.53636: waiting for pending results... 
11762 1726853305.53876: running TaskExecutor() for managed_node2/TASK: Show result 11762 1726853305.53954: in run() - task 02083763-bbaf-d845-03d0-000000000a73 11762 1726853305.53965: variable 'ansible_search_path' from source: unknown 11762 1726853305.53969: variable 'ansible_search_path' from source: unknown 11762 1726853305.54003: calling self._execute() 11762 1726853305.54081: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.54085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.54094: variable 'omit' from source: magic vars 11762 1726853305.54383: variable 'ansible_distribution_major_version' from source: facts 11762 1726853305.54392: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853305.54398: variable 'omit' from source: magic vars 11762 1726853305.54413: variable 'omit' from source: magic vars 11762 1726853305.54441: variable 'omit' from source: magic vars 11762 1726853305.54475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853305.54503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853305.54521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853305.54539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853305.54548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853305.54568: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853305.54573: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.54575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.54645: Set 
connection var ansible_timeout to 10 11762 1726853305.54649: Set connection var ansible_shell_type to sh 11762 1726853305.54653: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853305.54655: Set connection var ansible_shell_executable to /bin/sh 11762 1726853305.54665: Set connection var ansible_pipelining to False 11762 1726853305.54668: Set connection var ansible_connection to ssh 11762 1726853305.54686: variable 'ansible_shell_executable' from source: unknown 11762 1726853305.54689: variable 'ansible_connection' from source: unknown 11762 1726853305.54693: variable 'ansible_module_compression' from source: unknown 11762 1726853305.54695: variable 'ansible_shell_type' from source: unknown 11762 1726853305.54697: variable 'ansible_shell_executable' from source: unknown 11762 1726853305.54700: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.54702: variable 'ansible_pipelining' from source: unknown 11762 1726853305.54706: variable 'ansible_timeout' from source: unknown 11762 1726853305.54710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.54814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853305.54824: variable 'omit' from source: magic vars 11762 1726853305.54829: starting attempt loop 11762 1726853305.54832: running the handler 11762 1726853305.54870: variable '__network_connections_result' from source: set_fact 11762 1726853305.54927: variable '__network_connections_result' from source: set_fact 11762 1726853305.55049: handler run complete 11762 1726853305.55067: attempt loop complete, returning result 11762 1726853305.55072: _execute() done 11762 1726853305.55075: dumping result to json 11762 
1726853305.55079: done dumping result, returning 11762 1726853305.55086: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-d845-03d0-000000000a73] 11762 1726853305.55092: sending task result for task 02083763-bbaf-d845-03d0-000000000a73 11762 1726853305.55185: done sending task result for task 02083763-bbaf-d845-03d0-000000000a73 11762 1726853305.55188: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 
'bond0': add connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 61017df5-d8c8-402f-9503-fd0fc150036f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0632f651-903c-44ef-ab96-c625e73a569b (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5c7babf4-8462-428b-96e1-53b35f33a6df (not-active)" ] } } 11762 1726853305.55293: no more pending results, returning what we have 11762 1726853305.55296: results queue empty 11762 1726853305.55302: checking for any_errors_fatal 11762 1726853305.55304: done checking for any_errors_fatal 11762 1726853305.55305: checking for max_fail_percentage 11762 1726853305.55306: done checking for max_fail_percentage 11762 1726853305.55307: checking to see if all hosts have failed and the running result is not ok 11762 1726853305.55308: done checking to see if all hosts have failed 11762 1726853305.55308: getting the remaining hosts for this loop 11762 1726853305.55312: done getting the remaining hosts for this loop 11762 1726853305.55315: getting the next task for host managed_node2 11762 1726853305.55323: done getting next task for host managed_node2 11762 1726853305.55326: ^ task is: TASK: Asserts 11762 1726853305.55329: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853305.55333: getting variables 11762 1726853305.55334: in VariableManager get_vars() 11762 1726853305.55368: Calling all_inventory to load vars for managed_node2 11762 1726853305.55377: Calling groups_inventory to load vars for managed_node2 11762 1726853305.55380: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853305.55389: Calling all_plugins_play to load vars for managed_node2 11762 1726853305.55391: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853305.55394: Calling groups_plugins_play to load vars for managed_node2 11762 1726853305.56752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853305.57613: done with get_vars() 11762 1726853305.57628: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:28:25 -0400 (0:00:00.044) 0:00:56.007 ****** 11762 1726853305.57699: entering _queue_task() for managed_node2/include_tasks 11762 1726853305.57932: worker is 1 (out of 1 available) 11762 1726853305.57949: exiting _queue_task() for managed_node2/include_tasks 11762 1726853305.57962: done queuing things up, now waiting for results queue to drain 11762 1726853305.57964: waiting for pending results... 
11762 1726853305.58148: running TaskExecutor() for managed_node2/TASK: Asserts 11762 1726853305.58225: in run() - task 02083763-bbaf-d845-03d0-0000000008ef 11762 1726853305.58236: variable 'ansible_search_path' from source: unknown 11762 1726853305.58240: variable 'ansible_search_path' from source: unknown 11762 1726853305.58276: variable 'lsr_assert' from source: include params 11762 1726853305.58446: variable 'lsr_assert' from source: include params 11762 1726853305.58495: variable 'omit' from source: magic vars 11762 1726853305.58595: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.58605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.58617: variable 'omit' from source: magic vars 11762 1726853305.58788: variable 'ansible_distribution_major_version' from source: facts 11762 1726853305.58796: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853305.58801: variable 'item' from source: unknown 11762 1726853305.58848: variable 'item' from source: unknown 11762 1726853305.58870: variable 'item' from source: unknown 11762 1726853305.58913: variable 'item' from source: unknown 11762 1726853305.59036: dumping result to json 11762 1726853305.59038: done dumping result, returning 11762 1726853305.59040: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-d845-03d0-0000000008ef] 11762 1726853305.59045: sending task result for task 02083763-bbaf-d845-03d0-0000000008ef 11762 1726853305.59082: done sending task result for task 02083763-bbaf-d845-03d0-0000000008ef 11762 1726853305.59086: WORKER PROCESS EXITING 11762 1726853305.59106: no more pending results, returning what we have 11762 1726853305.59110: in VariableManager get_vars() 11762 1726853305.59154: Calling all_inventory to load vars for managed_node2 11762 1726853305.59157: Calling groups_inventory to load vars for managed_node2 11762 1726853305.59159: Calling all_plugins_inventory 
to load vars for managed_node2 11762 1726853305.59168: Calling all_plugins_play to load vars for managed_node2 11762 1726853305.59173: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853305.59176: Calling groups_plugins_play to load vars for managed_node2 11762 1726853305.59925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853305.60900: done with get_vars() 11762 1726853305.60913: variable 'ansible_search_path' from source: unknown 11762 1726853305.60914: variable 'ansible_search_path' from source: unknown 11762 1726853305.60941: we have included files to process 11762 1726853305.60941: generating all_blocks data 11762 1726853305.60944: done generating all_blocks data 11762 1726853305.60948: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11762 1726853305.60949: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11762 1726853305.60951: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11762 1726853305.61116: in VariableManager get_vars() 11762 1726853305.61132: done with get_vars() 11762 1726853305.61157: in VariableManager get_vars() 11762 1726853305.61172: done with get_vars() 11762 1726853305.61181: done processing included file 11762 1726853305.61182: iterating over new_blocks loaded from include file 11762 1726853305.61183: in VariableManager get_vars() 11762 1726853305.61194: done with get_vars() 11762 1726853305.61195: filtering new block on tags 11762 1726853305.61224: done filtering new block on tags 11762 1726853305.61225: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node2 => (item=tasks/assert_bond_options.yml) 11762 1726853305.61229: extending task lists for all hosts with included blocks 11762 1726853305.62981: done extending task lists 11762 1726853305.62982: done processing included files 11762 1726853305.62982: results queue empty 11762 1726853305.62983: checking for any_errors_fatal 11762 1726853305.62986: done checking for any_errors_fatal 11762 1726853305.62987: checking for max_fail_percentage 11762 1726853305.62987: done checking for max_fail_percentage 11762 1726853305.62988: checking to see if all hosts have failed and the running result is not ok 11762 1726853305.62989: done checking to see if all hosts have failed 11762 1726853305.62989: getting the remaining hosts for this loop 11762 1726853305.62990: done getting the remaining hosts for this loop 11762 1726853305.62991: getting the next task for host managed_node2 11762 1726853305.62994: done getting next task for host managed_node2 11762 1726853305.62995: ^ task is: TASK: ** TEST check bond settings 11762 1726853305.62997: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11762 1726853305.62999: getting variables 11762 1726853305.62999: in VariableManager get_vars() 11762 1726853305.63008: Calling all_inventory to load vars for managed_node2 11762 1726853305.63009: Calling groups_inventory to load vars for managed_node2 11762 1726853305.63010: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853305.63015: Calling all_plugins_play to load vars for managed_node2 11762 1726853305.63016: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853305.63018: Calling groups_plugins_play to load vars for managed_node2 11762 1726853305.63632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853305.64472: done with get_vars() 11762 1726853305.64487: done getting variables 11762 1726853305.64516: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 13:28:25 -0400 (0:00:00.068) 0:00:56.075 ****** 11762 1726853305.64536: entering _queue_task() for managed_node2/command 11762 1726853305.64781: worker is 1 (out of 1 available) 11762 1726853305.64795: exiting _queue_task() for managed_node2/command 11762 1726853305.64808: done queuing things up, now waiting for results queue to drain 11762 1726853305.64810: waiting for pending results... 
11762 1726853305.64998: running TaskExecutor() for managed_node2/TASK: ** TEST check bond settings 11762 1726853305.65063: in run() - task 02083763-bbaf-d845-03d0-000000000c2a 11762 1726853305.65075: variable 'ansible_search_path' from source: unknown 11762 1726853305.65079: variable 'ansible_search_path' from source: unknown 11762 1726853305.65116: variable 'bond_options_to_assert' from source: set_fact 11762 1726853305.65279: variable 'bond_options_to_assert' from source: set_fact 11762 1726853305.65360: variable 'omit' from source: magic vars 11762 1726853305.65460: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.65466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.65478: variable 'omit' from source: magic vars 11762 1726853305.65641: variable 'ansible_distribution_major_version' from source: facts 11762 1726853305.65651: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853305.65656: variable 'omit' from source: magic vars 11762 1726853305.65688: variable 'omit' from source: magic vars 11762 1726853305.65862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853305.67475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853305.67523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853305.67558: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853305.67583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853305.67602: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853305.67670: variable 'controller_device' from source: play vars 11762 
1726853305.67675: variable 'bond_opt' from source: unknown 11762 1726853305.67692: variable 'omit' from source: magic vars 11762 1726853305.67714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853305.67733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853305.67750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853305.67770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853305.67775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853305.67796: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853305.67800: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853305.67802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.67866: Set connection var ansible_timeout to 10 11762 1726853305.67976: Set connection var ansible_shell_type to sh 11762 1726853305.67979: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853305.67981: Set connection var ansible_shell_executable to /bin/sh 11762 1726853305.67983: Set connection var ansible_pipelining to False 11762 1726853305.67985: Set connection var ansible_connection to ssh 11762 1726853305.67987: variable 'ansible_shell_executable' from source: unknown 11762 1726853305.67989: variable 'ansible_connection' from source: unknown 11762 1726853305.67992: variable 'ansible_module_compression' from source: unknown 11762 1726853305.67993: variable 'ansible_shell_type' from source: unknown 11762 1726853305.67995: variable 'ansible_shell_executable' from source: unknown 11762 1726853305.67997: variable 'ansible_host' from source: host vars 
for 'managed_node2' 11762 1726853305.67999: variable 'ansible_pipelining' from source: unknown 11762 1726853305.68000: variable 'ansible_timeout' from source: unknown 11762 1726853305.68002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853305.68074: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853305.68094: variable 'omit' from source: magic vars 11762 1726853305.68104: starting attempt loop 11762 1726853305.68111: running the handler 11762 1726853305.68130: _low_level_execute_command(): starting 11762 1726853305.68141: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853305.68792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853305.68807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853305.68820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853305.68887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.68930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853305.68951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853305.68977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.69077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.70787: stdout chunk (state=3): >>>/root <<< 11762 1726853305.71276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.71279: stdout chunk (state=3): >>><<< 11762 1726853305.71281: stderr chunk (state=3): >>><<< 11762 1726853305.71285: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 11762 1726853305.71293: _low_level_execute_command(): starting 11762 1726853305.71296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024 `" && echo ansible-tmp-1726853305.7107284-14420-279351658800024="` echo /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024 `" ) && sleep 0' 11762 1726853305.72015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853305.72026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853305.72041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853305.72055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853305.72067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853305.72079: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853305.72177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.72182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853305.72184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853305.72200: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.72298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.75189: stdout chunk (state=3): >>>ansible-tmp-1726853305.7107284-14420-279351658800024=/root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024 <<< 11762 1726853305.75193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.75195: stdout chunk (state=3): >>><<< 11762 1726853305.75197: stderr chunk (state=3): >>><<< 11762 1726853305.75199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853305.7107284-14420-279351658800024=/root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853305.75201: variable 
'ansible_module_compression' from source: unknown 11762 1726853305.75203: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853305.75230: variable 'ansible_facts' from source: unknown 11762 1726853305.75432: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/AnsiballZ_command.py 11762 1726853305.75749: Sending initial data 11762 1726853305.75759: Sent initial data (156 bytes) 11762 1726853305.76317: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853305.76332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853305.76393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.76406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.76498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.78214: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853305.78265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853305.78340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp0vwa04h6 /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/AnsiballZ_command.py <<< 11762 1726853305.78346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/AnsiballZ_command.py" <<< 11762 1726853305.78552: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp0vwa04h6" to remote "/root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/AnsiballZ_command.py" <<< 11762 1726853305.80331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.80335: stdout chunk (state=3): >>><<< 11762 1726853305.80341: stderr chunk (state=3): >>><<< 11762 1726853305.80976: done transferring module to remote 11762 1726853305.80980: _low_level_execute_command(): starting 11762 
1726853305.80983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/ /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/AnsiballZ_command.py && sleep 0' 11762 1726853305.81705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853305.81709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.81920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.81964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853305.81968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853305.82027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853305.82070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853305.84102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853305.84106: stdout chunk (state=3): >>><<< 11762 1726853305.84113: stderr chunk (state=3): >>><<< 11762 1726853305.84129: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853305.84132: _low_level_execute_command(): starting 11762 1726853305.84137: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/AnsiballZ_command.py && sleep 0' 11762 1726853305.85377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.85382: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853305.85385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853305.85387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853305.85792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.02307: stdout chunk (state=3): >>> {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 13:28:26.018949", "end": "2024-09-20 13:28:26.022106", "delta": "0:00:00.003157", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853306.04078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853306.04082: stdout chunk (state=3): >>><<< 11762 1726853306.04086: stderr chunk (state=3): >>><<< 11762 1726853306.04088: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 13:28:26.018949", "end": "2024-09-20 13:28:26.022106", "delta": "0:00:00.003157", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
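The record above ends with the command module's JSON result for `cat /sys/class/net/nm-bond/bonding/mode`, and the log that follows shows Ansible evaluating the condition `bond_opt.value in result.stdout` against it. A minimal Python sketch of that check, using a trimmed subset of the result fields copied from the log (the literal below is an assumption, not the full payload):

```python
import json

# Trimmed module result as reported in the log above (subset of fields).
raw = (
    '{"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, '
    '"cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"]}'
)
result = json.loads(raw)

# Loop item as shown in the task output: {'key': 'mode', 'value': 'active-backup'}.
bond_opt = {"key": "mode", "value": "active-backup"}

# The evaluated conditional is a plain substring membership test on stdout.
ok = result["rc"] == 0 and bond_opt["value"] in result["stdout"]
print(ok)  # -> True
```

The same substring test explains why `active-backup 1` satisfies the expected value `active-backup`: membership, not equality, is checked.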
11762 1726853306.04091: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853306.04098: _low_level_execute_command(): starting 11762 1726853306.04101: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853305.7107284-14420-279351658800024/ > /dev/null 2>&1 && sleep 0' 11762 1726853306.04722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.04730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.04740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.04760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.04769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.04778: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853306.04788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.04812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853306.04820: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853306.04981: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.04984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.04986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.04988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.05055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.06981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.06985: stdout chunk (state=3): >>><<< 11762 1726853306.06989: stderr chunk (state=3): >>><<< 11762 1726853306.07176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.07180: handler run complete 11762 1726853306.07182: Evaluated conditional (False): False 11762 1726853306.07218: variable 'bond_opt' from source: unknown 11762 1726853306.07230: variable 'result' from source: set_fact 11762 1726853306.07248: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853306.07264: attempt loop complete, returning result 11762 1726853306.07290: variable 'bond_opt' from source: unknown 11762 1726853306.07369: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'mode', 'value': 'active-backup'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "active-backup" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003157", "end": "2024-09-20 13:28:26.022106", "rc": 0, "start": "2024-09-20 13:28:26.018949" } STDOUT: active-backup 1 11762 1726853306.07758: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.07761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.07764: variable 'omit' from source: magic vars 11762 1726853306.07840: variable 'ansible_distribution_major_version' from source: facts 11762 1726853306.07850: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853306.07866: variable 'omit' from source: magic vars 11762 1726853306.07886: variable 'omit' from source: magic vars 11762 1726853306.08059: variable 'controller_device' from source: play vars 11762 
1726853306.08067: variable 'bond_opt' from source: unknown 11762 1726853306.08098: variable 'omit' from source: magic vars 11762 1726853306.08177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853306.08187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853306.08192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853306.08195: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853306.08197: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.08199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.08259: Set connection var ansible_timeout to 10 11762 1726853306.08266: Set connection var ansible_shell_type to sh 11762 1726853306.08278: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853306.08287: Set connection var ansible_shell_executable to /bin/sh 11762 1726853306.08310: Set connection var ansible_pipelining to False 11762 1726853306.08320: Set connection var ansible_connection to ssh 11762 1726853306.08344: variable 'ansible_shell_executable' from source: unknown 11762 1726853306.08351: variable 'ansible_connection' from source: unknown 11762 1726853306.08358: variable 'ansible_module_compression' from source: unknown 11762 1726853306.08409: variable 'ansible_shell_type' from source: unknown 11762 1726853306.08412: variable 'ansible_shell_executable' from source: unknown 11762 1726853306.08414: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.08416: variable 'ansible_pipelining' from source: unknown 11762 1726853306.08418: variable 'ansible_timeout' from source: unknown 11762 1726853306.08420: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.08492: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853306.08506: variable 'omit' from source: magic vars 11762 1726853306.08525: starting attempt loop 11762 1726853306.08533: running the handler 11762 1726853306.08546: _low_level_execute_command(): starting 11762 1726853306.08628: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853306.09194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.09207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.09228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.09246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.09291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.09358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 11762 1726853306.09379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.09410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.09523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.11253: stdout chunk (state=3): >>>/root <<< 11762 1726853306.11403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.11407: stdout chunk (state=3): >>><<< 11762 1726853306.11409: stderr chunk (state=3): >>><<< 11762 1726853306.11426: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.11442: _low_level_execute_command(): starting 11762 1726853306.11518: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080 `" && echo ansible-tmp-1726853306.1143253-14420-187568384899080="` echo /root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080 `" ) && sleep 0' 11762 1726853306.12082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.12097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.12119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.12156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.12231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.12238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.12261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.12376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.14424: stdout chunk (state=3): 
>>>ansible-tmp-1726853306.1143253-14420-187568384899080=/root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080 <<< 11762 1726853306.14543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.14592: stderr chunk (state=3): >>><<< 11762 1726853306.14595: stdout chunk (state=3): >>><<< 11762 1726853306.14609: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853306.1143253-14420-187568384899080=/root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.14778: variable 'ansible_module_compression' from source: unknown 11762 1726853306.14781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853306.14783: variable 'ansible_facts' 
from source: unknown 11762 1726853306.14785: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/AnsiballZ_command.py 11762 1726853306.14900: Sending initial data 11762 1726853306.15003: Sent initial data (156 bytes) 11762 1726853306.15513: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.15616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.15644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.15734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.17383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11762 1726853306.17398: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853306.17467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853306.17536: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp4k4v8_fn /root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/AnsiballZ_command.py <<< 11762 1726853306.17539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/AnsiballZ_command.py" <<< 11762 1726853306.17645: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp4k4v8_fn" to remote "/root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/AnsiballZ_command.py" <<< 11762 1726853306.17651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/AnsiballZ_command.py" <<< 11762 1726853306.18522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.18537: stderr chunk (state=3): >>><<< 11762 1726853306.18540: stdout chunk (state=3): >>><<< 11762 1726853306.18585: done transferring module to remote 11762 1726853306.18595: _low_level_execute_command(): starting 11762 1726853306.18598: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/ 
/root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/AnsiballZ_command.py && sleep 0' 11762 1726853306.19300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.19345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.19363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.19379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.19511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.21380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.21407: stderr chunk (state=3): >>><<< 11762 1726853306.21410: stdout chunk (state=3): >>><<< 11762 1726853306.21424: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.21431: _low_level_execute_command(): starting 11762 1726853306.21433: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/AnsiballZ_command.py && sleep 0' 11762 1726853306.22177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.22180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.22191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.22194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.22196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.22378: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853306.22381: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.22384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853306.22385: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853306.22387: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853306.22389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.22390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.22392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.22393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.22395: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853306.22397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.22398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.22400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.22401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.22564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.38846: stdout chunk (state=3): >>> {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 13:28:26.384138", "end": "2024-09-20 13:28:26.387454", "delta": "0:00:00.003316", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": 
null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853306.40508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853306.40513: stdout chunk (state=3): >>><<< 11762 1726853306.40515: stderr chunk (state=3): >>><<< 11762 1726853306.40577: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 13:28:26.384138", "end": "2024-09-20 13:28:26.387454", "delta": "0:00:00.003316", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853306.40580: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853306.40582: _low_level_execute_command(): starting 11762 1726853306.40585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853306.1143253-14420-187568384899080/ > /dev/null 2>&1 && sleep 0' 11762 1726853306.41203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.41223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.41236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.41254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.41270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.41283: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853306.41337: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.41391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.41408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.41433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.41540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.43618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.43648: stderr chunk (state=3): >>><<< 11762 1726853306.43666: stdout chunk (state=3): >>><<< 11762 1726853306.43715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.43718: handler run complete 11762 1726853306.43724: Evaluated conditional (False): False 11762 1726853306.43888: variable 'bond_opt' from source: unknown 11762 1726853306.43899: variable 'result' from source: set_fact 11762 1726853306.43915: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853306.44042: attempt loop complete, returning result 11762 1726853306.44045: variable 'bond_opt' from source: unknown 11762 1726853306.44048: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'arp_interval', 'value': '60'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_interval", "value": "60" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_interval" ], "delta": "0:00:00.003316", "end": "2024-09-20 13:28:26.387454", "rc": 0, "start": "2024-09-20 13:28:26.384138" } STDOUT: 60 11762 1726853306.44382: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.44385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.44388: variable 'omit' from source: magic vars 11762 1726853306.44444: variable 'ansible_distribution_major_version' from source: facts 11762 1726853306.44455: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853306.44463: variable 'omit' from source: magic vars 11762 1726853306.44482: variable 'omit' from source: magic vars 11762 1726853306.44655: variable 'controller_device' from 
source: play vars 11762 1726853306.44664: variable 'bond_opt' from source: unknown 11762 1726853306.44687: variable 'omit' from source: magic vars 11762 1726853306.44775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853306.44779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853306.44781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853306.44784: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853306.44786: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.44788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.44847: Set connection var ansible_timeout to 10 11762 1726853306.44855: Set connection var ansible_shell_type to sh 11762 1726853306.44864: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853306.44875: Set connection var ansible_shell_executable to /bin/sh 11762 1726853306.44886: Set connection var ansible_pipelining to False 11762 1726853306.44896: Set connection var ansible_connection to ssh 11762 1726853306.44919: variable 'ansible_shell_executable' from source: unknown 11762 1726853306.44927: variable 'ansible_connection' from source: unknown 11762 1726853306.44976: variable 'ansible_module_compression' from source: unknown 11762 1726853306.44979: variable 'ansible_shell_type' from source: unknown 11762 1726853306.44981: variable 'ansible_shell_executable' from source: unknown 11762 1726853306.44983: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.44985: variable 'ansible_pipelining' from source: unknown 11762 1726853306.44986: variable 'ansible_timeout' from source: unknown 
11762 1726853306.44988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.45066: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853306.45080: variable 'omit' from source: magic vars 11762 1726853306.45088: starting attempt loop 11762 1726853306.45094: running the handler 11762 1726853306.45103: _low_level_execute_command(): starting 11762 1726853306.45158: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853306.45740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.45752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.45766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.45787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.45814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.45908: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.45937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.46046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.47747: stdout chunk (state=3): >>>/root <<< 11762 1726853306.47901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.47904: stdout chunk (state=3): >>><<< 11762 1726853306.47907: stderr chunk (state=3): >>><<< 11762 1726853306.48005: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.48009: _low_level_execute_command(): starting 11762 1726853306.48011: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980 `" && echo ansible-tmp-1726853306.4792876-14420-57517439965980="` echo /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980 `" ) && sleep 0' 11762 1726853306.48504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.48507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853306.48526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853306.48529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.48581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.48591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.48658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.50628: stdout chunk (state=3): >>>ansible-tmp-1726853306.4792876-14420-57517439965980=/root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980 <<< 11762 1726853306.50818: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.50821: stdout chunk (state=3): >>><<< 11762 1726853306.50824: stderr chunk (state=3): >>><<< 11762 1726853306.50837: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853306.4792876-14420-57517439965980=/root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.50859: variable 'ansible_module_compression' from source: unknown 11762 1726853306.50891: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853306.50905: variable 'ansible_facts' from source: unknown 11762 1726853306.50957: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/AnsiballZ_command.py 
11762 1726853306.51048: Sending initial data 11762 1726853306.51052: Sent initial data (155 bytes) 11762 1726853306.51466: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.51496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.51500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.51503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.51518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.51567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.51573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.51646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.53267: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853306.53359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853306.53436: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpuj4f1dio /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/AnsiballZ_command.py <<< 11762 1726853306.53442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/AnsiballZ_command.py" <<< 11762 1726853306.53499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpuj4f1dio" to remote "/root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/AnsiballZ_command.py" <<< 11762 1726853306.54327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.54363: stderr chunk (state=3): >>><<< 11762 1726853306.54375: stdout chunk (state=3): >>><<< 11762 1726853306.54450: done transferring module to remote 11762 1726853306.54464: _low_level_execute_command(): starting 11762 1726853306.54544: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/ /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/AnsiballZ_command.py && 
sleep 0' 11762 1726853306.55165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.55182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.55198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.55226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.55292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.55349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.55374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.55390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.55661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.57496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.57623: stderr chunk (state=3): >>><<< 11762 1726853306.57627: stdout chunk (state=3): >>><<< 11762 1726853306.57629: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.57632: _low_level_execute_command(): starting 11762 1726853306.57635: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/AnsiballZ_command.py && sleep 0' 11762 1726853306.58151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.58166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.58187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.58213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.58232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.58287: stderr chunk (state=3): >>>debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.58337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.58351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.58381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.58534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.75198: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 13:28:26.747124", "end": "2024-09-20 13:28:26.750278", "delta": "0:00:00.003154", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853306.76738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.76756: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853306.76809: stderr chunk (state=3): >>><<< 11762 1726853306.76823: stdout chunk (state=3): >>><<< 11762 1726853306.76854: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 13:28:26.747124", "end": "2024-09-20 13:28:26.750278", "delta": "0:00:00.003154", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853306.76891: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_ip_target', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853306.76901: _low_level_execute_command(): starting 11762 1726853306.76910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853306.4792876-14420-57517439965980/ > /dev/null 2>&1 && sleep 0' 11762 1726853306.77526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.77540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.77557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.77579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.77598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.77638: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.77714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.77741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.77765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.77882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.79787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.79829: stderr chunk (state=3): >>><<< 11762 1726853306.79832: stdout chunk (state=3): >>><<< 11762 1726853306.79858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.79866: handler run complete 11762 1726853306.79884: Evaluated conditional (False): False 11762 1726853306.80276: variable 'bond_opt' from source: unknown 11762 1726853306.80280: variable 'result' from source: set_fact 11762 1726853306.80283: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853306.80286: attempt loop complete, returning result 11762 1726853306.80289: variable 'bond_opt' from source: unknown 11762 1726853306.80291: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'arp_ip_target', 'value': '192.0.2.128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_ip_target", "value": "192.0.2.128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_ip_target" ], "delta": "0:00:00.003154", "end": "2024-09-20 13:28:26.750278", "rc": 0, "start": "2024-09-20 13:28:26.747124" } STDOUT: 192.0.2.128 11762 1726853306.80394: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.80398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.80400: variable 'omit' from source: magic vars 11762 1726853306.80460: variable 'ansible_distribution_major_version' from source: facts 11762 1726853306.80475: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853306.80483: variable 'omit' from source: magic vars 11762 1726853306.80505: variable 'omit' from source: magic vars 11762 1726853306.80670: variable 'controller_device' from source: play vars 11762 1726853306.80691: variable 'bond_opt' from source: unknown 11762 1726853306.80712: variable 'omit' from source: magic vars 11762 1726853306.80735: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853306.80746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853306.80756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853306.80773: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853306.80780: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.80793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.80865: Set connection var ansible_timeout to 10 11762 1726853306.80875: Set connection var ansible_shell_type to sh 11762 1726853306.80884: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853306.80898: Set connection var ansible_shell_executable to /bin/sh 11762 1726853306.80910: Set connection var ansible_pipelining to False 11762 1726853306.80920: Set connection var ansible_connection to ssh 11762 1726853306.80940: variable 'ansible_shell_executable' from source: unknown 11762 1726853306.80947: variable 'ansible_connection' from source: unknown 11762 1726853306.80953: variable 'ansible_module_compression' from source: unknown 11762 1726853306.80959: variable 'ansible_shell_type' from source: unknown 11762 1726853306.80965: variable 'ansible_shell_executable' from source: unknown 11762 1726853306.81004: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853306.81011: variable 'ansible_pipelining' from source: unknown 11762 1726853306.81013: variable 'ansible_timeout' from source: unknown 11762 1726853306.81015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853306.81089: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853306.81101: variable 'omit' from source: magic vars 11762 1726853306.81113: starting attempt loop 11762 1726853306.81123: running the handler 11762 1726853306.81199: _low_level_execute_command(): starting 11762 1726853306.81202: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853306.81756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.81839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.81880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.81894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.81918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.82029: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11762 1726853306.83787: stdout chunk (state=3): >>>/root <<< 11762 1726853306.83944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.83953: stdout chunk (state=3): >>><<< 11762 1726853306.83963: stderr chunk (state=3): >>><<< 11762 1726853306.84065: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.84069: _low_level_execute_command(): starting 11762 1726853306.84074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116 `" && echo ansible-tmp-1726853306.839849-14420-140424267627116="` echo /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116 `" 
) && sleep 0' 11762 1726853306.84604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.84618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.84632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853306.84656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853306.84676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853306.84690: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853306.84782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.84802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.84816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.84922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.86963: stdout chunk (state=3): >>>ansible-tmp-1726853306.839849-14420-140424267627116=/root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116 <<< 11762 1726853306.87089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.87129: stderr chunk 
(state=3): >>><<< 11762 1726853306.87132: stdout chunk (state=3): >>><<< 11762 1726853306.87276: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853306.839849-14420-140424267627116=/root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.87280: variable 'ansible_module_compression' from source: unknown 11762 1726853306.87282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853306.87285: variable 'ansible_facts' from source: unknown 11762 1726853306.87312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/AnsiballZ_command.py 11762 1726853306.87497: Sending initial data 11762 1726853306.87501: Sent initial data (155 bytes) 
11762 1726853306.88077: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853306.88159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853306.88176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.88212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853306.88232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.88253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.88361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.90054: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 
debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853306.90141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853306.90209: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpp_eh0sgg /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/AnsiballZ_command.py <<< 11762 1726853306.90212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/AnsiballZ_command.py" <<< 11762 1726853306.90274: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpp_eh0sgg" to remote "/root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/AnsiballZ_command.py" <<< 11762 1726853306.91247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.91390: stderr chunk (state=3): >>><<< 11762 1726853306.91393: stdout chunk (state=3): >>><<< 11762 1726853306.91395: done transferring module to remote 11762 1726853306.91397: _low_level_execute_command(): starting 11762 1726853306.91399: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/ /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/AnsiballZ_command.py && sleep 0' 11762 1726853306.91950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.91958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853306.91990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853306.92082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.92086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.92198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853306.94149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853306.94378: stderr chunk (state=3): >>><<< 11762 1726853306.94382: stdout chunk (state=3): >>><<< 11762 1726853306.94385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853306.94388: _low_level_execute_command(): starting 11762 1726853306.94390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/AnsiballZ_command.py && sleep 0' 11762 1726853306.95394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853306.95521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853306.95537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853306.95652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.11636: stdout chunk (state=3): >>> {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 13:28:27.112186", "end": "2024-09-20 13:28:27.115310", "delta": "0:00:00.003124", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853307.13364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.13469: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853307.13475: stdout chunk (state=3): >>><<< 11762 1726853307.13478: stderr chunk (state=3): >>><<< 11762 1726853307.13480: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 13:28:27.112186", "end": "2024-09-20 13:28:27.115310", "delta": "0:00:00.003124", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853307.13483: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_validate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853307.13485: _low_level_execute_command(): starting 11762 1726853307.13487: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853306.839849-14420-140424267627116/ > /dev/null 2>&1 && sleep 0' 11762 1726853307.14167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853307.14281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853307.14312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.14420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.16598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.16608: stdout chunk (state=3): >>><<< 11762 1726853307.16621: stderr chunk (state=3): >>><<< 11762 1726853307.16686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.16690: handler run complete 11762 1726853307.16692: Evaluated conditional (False): False 11762 
1726853307.16848: variable 'bond_opt' from source: unknown 11762 1726853307.16860: variable 'result' from source: set_fact 11762 1726853307.16881: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853307.16897: attempt loop complete, returning result 11762 1726853307.17076: variable 'bond_opt' from source: unknown 11762 1726853307.17079: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'arp_validate', 'value': 'none'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_validate", "value": "none" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_validate" ], "delta": "0:00:00.003124", "end": "2024-09-20 13:28:27.115310", "rc": 0, "start": "2024-09-20 13:28:27.112186" } STDOUT: none 0 11762 1726853307.17205: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.17219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.17237: variable 'omit' from source: magic vars 11762 1726853307.17394: variable 'ansible_distribution_major_version' from source: facts 11762 1726853307.17400: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853307.17406: variable 'omit' from source: magic vars 11762 1726853307.17418: variable 'omit' from source: magic vars 11762 1726853307.17580: variable 'controller_device' from source: play vars 11762 1726853307.17591: variable 'bond_opt' from source: unknown 11762 1726853307.17624: variable 'omit' from source: magic vars 11762 1726853307.17627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853307.17636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853307.17676: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853307.17679: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853307.17681: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.17683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.17738: Set connection var ansible_timeout to 10 11762 1726853307.17741: Set connection var ansible_shell_type to sh 11762 1726853307.17746: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853307.17749: Set connection var ansible_shell_executable to /bin/sh 11762 1726853307.17757: Set connection var ansible_pipelining to False 11762 1726853307.17780: Set connection var ansible_connection to ssh 11762 1726853307.17783: variable 'ansible_shell_executable' from source: unknown 11762 1726853307.17785: variable 'ansible_connection' from source: unknown 11762 1726853307.17788: variable 'ansible_module_compression' from source: unknown 11762 1726853307.17790: variable 'ansible_shell_type' from source: unknown 11762 1726853307.17792: variable 'ansible_shell_executable' from source: unknown 11762 1726853307.17841: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.17846: variable 'ansible_pipelining' from source: unknown 11762 1726853307.17849: variable 'ansible_timeout' from source: unknown 11762 1726853307.17851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.17903: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853307.17918: variable 'omit' from source: magic vars 11762 1726853307.17921: starting 
attempt loop 11762 1726853307.17924: running the handler 11762 1726853307.17931: _low_level_execute_command(): starting 11762 1726853307.17934: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853307.18457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.18461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853307.18463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853307.18465: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.18468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.18511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.18514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.18593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.20291: stdout chunk (state=3): >>>/root <<< 11762 1726853307.20434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.20437: stdout chunk (state=3): >>><<< 11762 
1726853307.20440: stderr chunk (state=3): >>><<< 11762 1726853307.20531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.20541: _low_level_execute_command(): starting 11762 1726853307.20543: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758 `" && echo ansible-tmp-1726853307.204597-14420-64760327291758="` echo /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758 `" ) && sleep 0' 11762 1726853307.20942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.20955: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.20965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.21023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.21026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.21105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.23375: stdout chunk (state=3): >>>ansible-tmp-1726853307.204597-14420-64760327291758=/root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758 <<< 11762 1726853307.23494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.23532: stderr chunk (state=3): >>><<< 11762 1726853307.23535: stdout chunk (state=3): >>><<< 11762 1726853307.23633: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853307.204597-14420-64760327291758=/root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.23636: variable 'ansible_module_compression' from source: unknown 11762 1726853307.23639: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853307.23680: variable 'ansible_facts' from source: unknown 11762 1726853307.23704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/AnsiballZ_command.py 11762 1726853307.23805: Sending initial data 11762 1726853307.23809: Sent initial data (154 bytes) 11762 1726853307.24208: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.24211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.24214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.24216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.24261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.24267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.24338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.26004: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853307.26077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853307.26141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp7q8elu4_ /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/AnsiballZ_command.py <<< 11762 1726853307.26148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/AnsiballZ_command.py" <<< 11762 1726853307.26216: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmp7q8elu4_" to remote "/root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/AnsiballZ_command.py" <<< 11762 1726853307.27026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.27127: stderr chunk (state=3): >>><<< 11762 1726853307.27130: stdout chunk (state=3): >>><<< 11762 1726853307.27132: done transferring module to remote 11762 1726853307.27134: _low_level_execute_command(): starting 11762 1726853307.27136: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/ /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/AnsiballZ_command.py && sleep 0' 11762 1726853307.27575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.27588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853307.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853307.27613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.27615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.27660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853307.27676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.27744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.30458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.30480: stderr chunk (state=3): >>><<< 11762 1726853307.30483: stdout chunk (state=3): >>><<< 11762 1726853307.30500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.30503: _low_level_execute_command(): starting 11762 1726853307.30506: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/AnsiballZ_command.py && sleep 0' 11762 1726853307.30936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.30939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.30941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853307.30943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.30989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 
11762 1726853307.30993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.31080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.47648: stdout chunk (state=3): >>> {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 13:28:27.472357", "end": "2024-09-20 13:28:27.475547", "delta": "0:00:00.003190", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853307.49211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853307.49242: stderr chunk (state=3): >>><<< 11762 1726853307.49245: stdout chunk (state=3): >>><<< 11762 1726853307.49261: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 13:28:27.472357", "end": "2024-09-20 13:28:27.475547", "delta": "0:00:00.003190", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853307.49284: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/primary', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853307.49289: _low_level_execute_command(): starting 11762 1726853307.49296: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853307.204597-14420-64760327291758/ > /dev/null 2>&1 && sleep 0' 11762 
1726853307.49742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853307.49748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853307.49750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.49752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.49754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.49807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.49814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853307.49816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.49889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.51788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.51816: stderr chunk (state=3): >>><<< 11762 1726853307.51820: stdout chunk (state=3): >>><<< 11762 1726853307.51832: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.51837: handler run complete 11762 1726853307.51854: Evaluated conditional (False): False 11762 1726853307.51963: variable 'bond_opt' from source: unknown 11762 1726853307.51968: variable 'result' from source: set_fact 11762 1726853307.51980: Evaluated conditional (bond_opt.value in result.stdout): True 11762 1726853307.51988: attempt loop complete, returning result 11762 1726853307.52002: variable 'bond_opt' from source: unknown 11762 1726853307.52050: variable 'bond_opt' from source: unknown
ok: [managed_node2] => (item={'key': 'primary', 'value': 'test1'}) => {
    "ansible_loop_var": "bond_opt",
    "attempts": 1,
    "bond_opt": {
        "key": "primary",
        "value": "test1"
    },
    "changed": false,
    "cmd": [
        "cat",
        "/sys/class/net/nm-bond/bonding/primary"
    ],
    "delta": "0:00:00.003190",
    "end": "2024-09-20 13:28:27.475547",
    "rc": 0,
    "start": "2024-09-20 13:28:27.472357"
}

STDOUT:

test1

11762 1726853307.52179: dumping
result to json 11762 1726853307.52182: done dumping result, returning 11762 1726853307.52184: done running TaskExecutor() for managed_node2/TASK: ** TEST check bond settings [02083763-bbaf-d845-03d0-000000000c2a] 11762 1726853307.52186: sending task result for task 02083763-bbaf-d845-03d0-000000000c2a 11762 1726853307.52238: done sending task result for task 02083763-bbaf-d845-03d0-000000000c2a 11762 1726853307.52240: WORKER PROCESS EXITING 11762 1726853307.52377: no more pending results, returning what we have 11762 1726853307.52381: results queue empty 11762 1726853307.52382: checking for any_errors_fatal 11762 1726853307.52384: done checking for any_errors_fatal 11762 1726853307.52384: checking for max_fail_percentage 11762 1726853307.52386: done checking for max_fail_percentage 11762 1726853307.52386: checking to see if all hosts have failed and the running result is not ok 11762 1726853307.52387: done checking to see if all hosts have failed 11762 1726853307.52388: getting the remaining hosts for this loop 11762 1726853307.52389: done getting the remaining hosts for this loop 11762 1726853307.52392: getting the next task for host managed_node2 11762 1726853307.52398: done getting next task for host managed_node2 11762 1726853307.52400: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 11762 1726853307.52403: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853307.52407: getting variables 11762 1726853307.52408: in VariableManager get_vars() 11762 1726853307.52447: Calling all_inventory to load vars for managed_node2 11762 1726853307.52449: Calling groups_inventory to load vars for managed_node2 11762 1726853307.52451: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853307.52460: Calling all_plugins_play to load vars for managed_node2 11762 1726853307.52462: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853307.52465: Calling groups_plugins_play to load vars for managed_node2 11762 1726853307.53336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853307.54210: done with get_vars() 11762 1726853307.54227: done getting variables

TASK [Include the task 'assert_IPv4_present.yml'] ******************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11
Friday 20 September 2024 13:28:27 -0400 (0:00:01.897) 0:00:57.973 ******

11762 1726853307.54297: entering _queue_task() for managed_node2/include_tasks 11762 1726853307.54546: worker is 1 (out of 1 available) 11762 1726853307.54559: exiting _queue_task() for managed_node2/include_tasks 11762 1726853307.54573: done queuing things up, now waiting for results queue to drain 11762 1726853307.54575: waiting for pending results...
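The "** TEST check bond settings" loop above runs `cat /sys/class/net/nm-bond/bonding/<option>` for each expected bond option and, per the `Evaluated conditional (bond_opt.value in result.stdout)` lines, passes once the expected value is a substring of the command's stdout. A minimal standalone sketch of that test (the helper name is ours; the sample stdout values are the ones captured in this log):

```python
# The "** TEST check bond settings" task loops over expected bond options,
# reads /sys/class/net/nm-bond/bonding/<option> on the managed node, and
# retries until the expected value appears in stdout.

def bond_opt_matches(expected: str, stdout: str) -> bool:
    """Mirror Ansible's `bond_opt.value in result.stdout` test: a substring match."""
    return expected in stdout

# stdout values as captured in the log above:
assert bond_opt_matches("none", "none 0")    # arp_validate reads back "none 0"
assert bond_opt_matches("test1", "test1")    # primary reads back "test1"
```

Note the substring semantics: "none 0" (value plus its numeric index, as the bonding sysfs files report it) still matches the expected value "none".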
11762 1726853307.54756: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' 11762 1726853307.54830: in run() - task 02083763-bbaf-d845-03d0-000000000c2c 11762 1726853307.54844: variable 'ansible_search_path' from source: unknown 11762 1726853307.54848: variable 'ansible_search_path' from source: unknown 11762 1726853307.54876: calling self._execute() 11762 1726853307.54950: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.54954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.54963: variable 'omit' from source: magic vars 11762 1726853307.55241: variable 'ansible_distribution_major_version' from source: facts 11762 1726853307.55256: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853307.55261: _execute() done 11762 1726853307.55264: dumping result to json 11762 1726853307.55267: done dumping result, returning 11762 1726853307.55276: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' [02083763-bbaf-d845-03d0-000000000c2c] 11762 1726853307.55282: sending task result for task 02083763-bbaf-d845-03d0-000000000c2c 11762 1726853307.55366: done sending task result for task 02083763-bbaf-d845-03d0-000000000c2c 11762 1726853307.55369: WORKER PROCESS EXITING 11762 1726853307.55396: no more pending results, returning what we have 11762 1726853307.55401: in VariableManager get_vars() 11762 1726853307.55449: Calling all_inventory to load vars for managed_node2 11762 1726853307.55451: Calling groups_inventory to load vars for managed_node2 11762 1726853307.55453: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853307.55465: Calling all_plugins_play to load vars for managed_node2 11762 1726853307.55468: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853307.55470: Calling groups_plugins_play to load vars for managed_node2 11762 
1726853307.56254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853307.57203: done with get_vars() 11762 1726853307.57219: variable 'ansible_search_path' from source: unknown 11762 1726853307.57220: variable 'ansible_search_path' from source: unknown 11762 1726853307.57226: variable 'item' from source: include params 11762 1726853307.57302: variable 'item' from source: include params 11762 1726853307.57327: we have included files to process 11762 1726853307.57328: generating all_blocks data 11762 1726853307.57330: done generating all_blocks data 11762 1726853307.57333: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11762 1726853307.57334: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11762 1726853307.57335: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11762 1726853307.57460: done processing included file 11762 1726853307.57462: iterating over new_blocks loaded from include file 11762 1726853307.57463: in VariableManager get_vars() 11762 1726853307.57479: done with get_vars() 11762 1726853307.57480: filtering new block on tags 11762 1726853307.57497: done filtering new block on tags 11762 1726853307.57498: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node2 11762 1726853307.57502: extending task lists for all hosts with included blocks 11762 1726853307.57623: done extending task lists 11762 1726853307.57624: done processing included files 11762 1726853307.57624: results queue empty 11762 1726853307.57625: checking for any_errors_fatal 11762 1726853307.57630: 
done checking for any_errors_fatal 11762 1726853307.57630: checking for max_fail_percentage 11762 1726853307.57631: done checking for max_fail_percentage 11762 1726853307.57632: checking to see if all hosts have failed and the running result is not ok 11762 1726853307.57632: done checking to see if all hosts have failed 11762 1726853307.57633: getting the remaining hosts for this loop 11762 1726853307.57633: done getting the remaining hosts for this loop 11762 1726853307.57635: getting the next task for host managed_node2 11762 1726853307.57638: done getting next task for host managed_node2 11762 1726853307.57639: ^ task is: TASK: ** TEST check IPv4 11762 1726853307.57642: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853307.57644: getting variables 11762 1726853307.57644: in VariableManager get_vars() 11762 1726853307.57655: Calling all_inventory to load vars for managed_node2 11762 1726853307.57656: Calling groups_inventory to load vars for managed_node2 11762 1726853307.57657: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853307.57661: Calling all_plugins_play to load vars for managed_node2 11762 1726853307.57662: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853307.57664: Calling groups_plugins_play to load vars for managed_node2 11762 1726853307.58288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853307.59113: done with get_vars() 11762 1726853307.59126: done getting variables 11762 1726853307.59155: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 13:28:27 -0400 (0:00:00.048) 0:00:58.022 ****** 11762 1726853307.59180: entering _queue_task() for managed_node2/command 11762 1726853307.59413: worker is 1 (out of 1 available) 11762 1726853307.59428: exiting _queue_task() for managed_node2/command 11762 1726853307.59442: done queuing things up, now waiting for results queue to drain 11762 1726853307.59444: waiting for pending results... 
11762 1726853307.59627: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 11762 1726853307.59714: in run() - task 02083763-bbaf-d845-03d0-000000000da6 11762 1726853307.59727: variable 'ansible_search_path' from source: unknown 11762 1726853307.59730: variable 'ansible_search_path' from source: unknown 11762 1726853307.59760: calling self._execute() 11762 1726853307.59830: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.59835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.59843: variable 'omit' from source: magic vars 11762 1726853307.60116: variable 'ansible_distribution_major_version' from source: facts 11762 1726853307.60125: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853307.60131: variable 'omit' from source: magic vars 11762 1726853307.60167: variable 'omit' from source: magic vars 11762 1726853307.60280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853307.61708: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853307.61753: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853307.61782: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853307.61806: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853307.61825: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853307.61885: variable 'interface' from source: include params 11762 1726853307.61888: variable 'controller_device' from source: play vars 11762 1726853307.61931: variable 'controller_device' from source: play vars 11762 1726853307.61955: variable 'omit' 
from source: magic vars 11762 1726853307.61977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853307.61998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853307.62012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853307.62024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853307.62033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853307.62059: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853307.62062: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.62066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.62130: Set connection var ansible_timeout to 10 11762 1726853307.62133: Set connection var ansible_shell_type to sh 11762 1726853307.62138: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853307.62143: Set connection var ansible_shell_executable to /bin/sh 11762 1726853307.62152: Set connection var ansible_pipelining to False 11762 1726853307.62157: Set connection var ansible_connection to ssh 11762 1726853307.62176: variable 'ansible_shell_executable' from source: unknown 11762 1726853307.62181: variable 'ansible_connection' from source: unknown 11762 1726853307.62184: variable 'ansible_module_compression' from source: unknown 11762 1726853307.62186: variable 'ansible_shell_type' from source: unknown 11762 1726853307.62188: variable 'ansible_shell_executable' from source: unknown 11762 1726853307.62190: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.62192: variable 'ansible_pipelining' from source: unknown 
11762 1726853307.62194: variable 'ansible_timeout' from source: unknown 11762 1726853307.62196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.62270: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853307.62282: variable 'omit' from source: magic vars 11762 1726853307.62285: starting attempt loop 11762 1726853307.62287: running the handler 11762 1726853307.62301: _low_level_execute_command(): starting 11762 1726853307.62314: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853307.62855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.62859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.62862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853307.62864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853307.62866: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11762 1726853307.62928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.62933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853307.62936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.63011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.64739: stdout chunk (state=3): >>>/root <<< 11762 1726853307.64840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.64872: stderr chunk (state=3): >>><<< 11762 1726853307.64876: stdout chunk (state=3): >>><<< 11762 1726853307.64895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.64906: 
_low_level_execute_command(): starting 11762 1726853307.64909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712 `" && echo ansible-tmp-1726853307.6489367-14530-205906424603712="` echo /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712 `" ) && sleep 0' 11762 1726853307.65326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853307.65330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853307.65332: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853307.65334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.65383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.65386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.65466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.67797: stdout chunk (state=3): 
>>>ansible-tmp-1726853307.6489367-14530-205906424603712=/root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712 <<< 11762 1726853307.67910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.67929: stderr chunk (state=3): >>><<< 11762 1726853307.67932: stdout chunk (state=3): >>><<< 11762 1726853307.67947: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853307.6489367-14530-205906424603712=/root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.67970: variable 'ansible_module_compression' from source: unknown 11762 1726853307.68005: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853307.68036: variable 'ansible_facts' 
from source: unknown 11762 1726853307.68093: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/AnsiballZ_command.py 11762 1726853307.68184: Sending initial data 11762 1726853307.68188: Sent initial data (156 bytes) 11762 1726853307.68608: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853307.68611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853307.68613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.68616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.68618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.68669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.68678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.68746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.70865: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11762 1726853307.70869: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853307.70934: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853307.71003: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmptss80jym /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/AnsiballZ_command.py <<< 11762 1726853307.71006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/AnsiballZ_command.py" <<< 11762 1726853307.71297: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmptss80jym" to remote "/root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/AnsiballZ_command.py" <<< 11762 1726853307.71301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/AnsiballZ_command.py" <<< 11762 1726853307.71976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.71985: stderr chunk (state=3): >>><<< 11762 1726853307.71987: stdout chunk (state=3): >>><<< 11762 1726853307.72025: done transferring module to remote 11762 1726853307.72034: _low_level_execute_command(): 
starting 11762 1726853307.72038: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/ /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/AnsiballZ_command.py && sleep 0' 11762 1726853307.72466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853307.72469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.72473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853307.72476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853307.72478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853307.72480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.72517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853307.72535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.72599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.74469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 
1726853307.74492: stderr chunk (state=3): >>><<< 11762 1726853307.74495: stdout chunk (state=3): >>><<< 11762 1726853307.74507: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.74510: _low_level_execute_command(): starting 11762 1726853307.74514: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/AnsiballZ_command.py && sleep 0' 11762 1726853307.74933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853307.74936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 <<< 11762 1726853307.74938: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.74941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.74993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853307.75000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.75076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.91062: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.204/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 240sec preferred_lft 240sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:27.905760", "end": "2024-09-20 13:28:27.909574", "delta": "0:00:00.003814", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853307.92697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853307.92726: stderr chunk (state=3): >>><<< 11762 1726853307.92729: stdout chunk (state=3): >>><<< 11762 1726853307.92746: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.204/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 240sec preferred_lft 240sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:27.905760", "end": "2024-09-20 13:28:27.909574", "delta": "0:00:00.003814", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853307.92780: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853307.92788: _low_level_execute_command(): starting 11762 1726853307.92791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853307.6489367-14530-205906424603712/ > /dev/null 2>&1 && sleep 0' 11762 1726853307.93240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853307.93246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853307.93250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.93252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11762 1726853307.93254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853307.93313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853307.93316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853307.93318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853307.93384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853307.95268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853307.95294: stderr chunk (state=3): >>><<< 11762 1726853307.95297: stdout chunk (state=3): >>><<< 11762 1726853307.95312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853307.95317: handler run complete 11762 1726853307.95335: Evaluated conditional (False): False 11762 1726853307.95445: variable 'address' from source: include params 11762 1726853307.95449: variable 'result' from source: set_fact 11762 1726853307.95464: Evaluated conditional (address in result.stdout): True 11762 1726853307.95475: attempt loop complete, returning result 11762 1726853307.95478: _execute() done 11762 1726853307.95480: dumping result to json 11762 1726853307.95484: done dumping result, returning 11762 1726853307.95491: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 [02083763-bbaf-d845-03d0-000000000da6] 11762 1726853307.95495: sending task result for task 02083763-bbaf-d845-03d0-000000000da6 11762 1726853307.95592: done sending task result for task 02083763-bbaf-d845-03d0-000000000da6 11762 1726853307.95595: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003814", "end": "2024-09-20 13:28:27.909574", "rc": 0, "start": "2024-09-20 13:28:27.905760" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.204/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 240sec preferred_lft 240sec 11762 1726853307.95698: no more pending results, returning what we have 11762 1726853307.95701: results queue empty 11762 1726853307.95702: checking for any_errors_fatal 11762 1726853307.95709: done checking for any_errors_fatal 11762 1726853307.95710: checking for max_fail_percentage 11762 1726853307.95712: done checking for max_fail_percentage 11762 1726853307.95713: checking to see if all hosts have failed and the running result is not ok 11762 1726853307.95714: done checking 
to see if all hosts have failed 11762 1726853307.95714: getting the remaining hosts for this loop 11762 1726853307.95716: done getting the remaining hosts for this loop 11762 1726853307.95719: getting the next task for host managed_node2 11762 1726853307.95727: done getting next task for host managed_node2 11762 1726853307.95730: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 11762 1726853307.95733: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853307.95737: getting variables 11762 1726853307.95738: in VariableManager get_vars() 11762 1726853307.95781: Calling all_inventory to load vars for managed_node2 11762 1726853307.95784: Calling groups_inventory to load vars for managed_node2 11762 1726853307.95786: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853307.95795: Calling all_plugins_play to load vars for managed_node2 11762 1726853307.95797: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853307.95800: Calling groups_plugins_play to load vars for managed_node2 11762 1726853307.96677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853307.98173: done with get_vars() 11762 1726853307.98197: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 13:28:27 -0400 (0:00:00.391) 0:00:58.413 ****** 11762 1726853307.98306: entering _queue_task() for managed_node2/include_tasks 11762 1726853307.98555: worker is 1 (out of 1 available) 11762 1726853307.98569: exiting _queue_task() for managed_node2/include_tasks 11762 1726853307.98585: done queuing things up, now waiting for results queue to drain 11762 1726853307.98587: waiting for pending results... 
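For reference, the "** TEST check IPv4" result logged above ("Evaluated conditional (address in result.stdout): True") reduces to a plain substring test on the command's stdout. A minimal sketch of that check, using the stdout captured in this log; the variable names mirror the log, but the surrounding logic is an assumption, not the actual playbook source:

```python
# Sketch of the "** TEST check IPv4" assertion: the task ran
# `ip -4 a s nm-bond` and then evaluated the Jinja2 conditional
# `address in result.stdout`. The stdout below is copied from the log;
# on a live host it would come from running the command.
address = "192.0.2.204"
stdout = (
    "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n"
    "    inet 192.0.2.204/24 brd 192.0.2.255 scope global dynamic "
    "noprefixroute nm-bond\n"
    "       valid_lft 240sec preferred_lft 240sec"
)
# Mirrors the evaluated conditional `address in result.stdout`.
ipv4_present = address in stdout
print(ipv4_present)  # → True
```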
11762 1726853307.98767: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' 11762 1726853307.98839: in run() - task 02083763-bbaf-d845-03d0-000000000c2d 11762 1726853307.98852: variable 'ansible_search_path' from source: unknown 11762 1726853307.98856: variable 'ansible_search_path' from source: unknown 11762 1726853307.98885: calling self._execute() 11762 1726853307.98958: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853307.98963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853307.98973: variable 'omit' from source: magic vars 11762 1726853307.99248: variable 'ansible_distribution_major_version' from source: facts 11762 1726853307.99261: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853307.99264: _execute() done 11762 1726853307.99267: dumping result to json 11762 1726853307.99270: done dumping result, returning 11762 1726853307.99274: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' [02083763-bbaf-d845-03d0-000000000c2d] 11762 1726853307.99280: sending task result for task 02083763-bbaf-d845-03d0-000000000c2d 11762 1726853307.99375: done sending task result for task 02083763-bbaf-d845-03d0-000000000c2d 11762 1726853307.99378: WORKER PROCESS EXITING 11762 1726853307.99406: no more pending results, returning what we have 11762 1726853307.99410: in VariableManager get_vars() 11762 1726853307.99459: Calling all_inventory to load vars for managed_node2 11762 1726853307.99462: Calling groups_inventory to load vars for managed_node2 11762 1726853307.99464: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853307.99480: Calling all_plugins_play to load vars for managed_node2 11762 1726853307.99482: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853307.99485: Calling groups_plugins_play to load vars for managed_node2 11762 
1726853308.00331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853308.01883: done with get_vars() 11762 1726853308.01903: variable 'ansible_search_path' from source: unknown 11762 1726853308.01904: variable 'ansible_search_path' from source: unknown 11762 1726853308.01912: variable 'item' from source: include params 11762 1726853308.02008: variable 'item' from source: include params 11762 1726853308.02036: we have included files to process 11762 1726853308.02037: generating all_blocks data 11762 1726853308.02039: done generating all_blocks data 11762 1726853308.02046: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11762 1726853308.02047: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11762 1726853308.02048: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11762 1726853308.02211: done processing included file 11762 1726853308.02214: iterating over new_blocks loaded from include file 11762 1726853308.02215: in VariableManager get_vars() 11762 1726853308.02237: done with get_vars() 11762 1726853308.02239: filtering new block on tags 11762 1726853308.02267: done filtering new block on tags 11762 1726853308.02270: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node2 11762 1726853308.02277: extending task lists for all hosts with included blocks 11762 1726853308.02596: done extending task lists 11762 1726853308.02597: done processing included files 11762 1726853308.02598: results queue empty 11762 1726853308.02599: checking for any_errors_fatal 11762 1726853308.02604: 
done checking for any_errors_fatal 11762 1726853308.02604: checking for max_fail_percentage 11762 1726853308.02606: done checking for max_fail_percentage 11762 1726853308.02606: checking to see if all hosts have failed and the running result is not ok 11762 1726853308.02607: done checking to see if all hosts have failed 11762 1726853308.02608: getting the remaining hosts for this loop 11762 1726853308.02609: done getting the remaining hosts for this loop 11762 1726853308.02611: getting the next task for host managed_node2 11762 1726853308.02615: done getting next task for host managed_node2 11762 1726853308.02617: ^ task is: TASK: ** TEST check IPv6 11762 1726853308.02620: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853308.02622: getting variables 11762 1726853308.02623: in VariableManager get_vars() 11762 1726853308.02636: Calling all_inventory to load vars for managed_node2 11762 1726853308.02638: Calling groups_inventory to load vars for managed_node2 11762 1726853308.02640: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853308.02649: Calling all_plugins_play to load vars for managed_node2 11762 1726853308.02651: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853308.02654: Calling groups_plugins_play to load vars for managed_node2 11762 1726853308.03866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853308.05583: done with get_vars() 11762 1726853308.05605: done getting variables 11762 1726853308.05652: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Friday 20 September 2024 13:28:28 -0400 (0:00:00.073) 0:00:58.487 ****** 11762 1726853308.05685: entering _queue_task() for managed_node2/command 11762 1726853308.06031: worker is 1 (out of 1 available) 11762 1726853308.06047: exiting _queue_task() for managed_node2/command 11762 1726853308.06060: done queuing things up, now waiting for results queue to drain 11762 1726853308.06062: waiting for pending results... 
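The "** TEST check IPv6" task queued here runs `ip -6 a s nm-bond` (its JSON result appears further down in this log). A hedged sketch of what a pass condition looks like, extracting the global-scope addresses from that output; the regex and parsing are assumptions for illustration, while the sample stdout is copied from the module result later in the log:

```python
import re

# stdout copied from the `ip -6 a s nm-bond` module result recorded below
# in this log; on a live host it would come from running the command.
stdout = (
    "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n"
    "    inet6 2001:db8::c4/128 scope global dynamic noprefixroute \n"
    "       valid_lft 238sec preferred_lft 238sec\n"
    "    inet6 2001:db8::1cc9:1bff:fec3:dbff/64 scope global dynamic "
    "noprefixroute \n"
    "       valid_lft 1797sec preferred_lft 1797sec\n"
    "    inet6 fe80::1cc9:1bff:fec3:dbff/64 scope link noprefixroute \n"
    "       valid_lft forever preferred_lft forever"
)
# Keep only global-scope addresses; the link-local fe80:: line is
# "scope link" and is excluded.
global_addrs = re.findall(r"inet6 (\S+) scope global", stdout)
print(global_addrs)
# → ['2001:db8::c4/128', '2001:db8::1cc9:1bff:fec3:dbff/64']
```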
11762 1726853308.06572: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 11762 1726853308.06579: in run() - task 02083763-bbaf-d845-03d0-000000000dc7 11762 1726853308.06583: variable 'ansible_search_path' from source: unknown 11762 1726853308.06586: variable 'ansible_search_path' from source: unknown 11762 1726853308.06589: calling self._execute() 11762 1726853308.06660: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.06679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.06696: variable 'omit' from source: magic vars 11762 1726853308.07083: variable 'ansible_distribution_major_version' from source: facts 11762 1726853308.07104: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853308.07118: variable 'omit' from source: magic vars 11762 1726853308.07183: variable 'omit' from source: magic vars 11762 1726853308.07358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853308.09746: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853308.09825: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853308.09904: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853308.09916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853308.09950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853308.10047: variable 'controller_device' from source: play vars 11762 1726853308.10121: variable 'omit' from source: magic vars 11762 1726853308.10124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853308.10151: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853308.10174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853308.10197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853308.10214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853308.10259: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853308.10274: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.10339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.10396: Set connection var ansible_timeout to 10 11762 1726853308.10407: Set connection var ansible_shell_type to sh 11762 1726853308.10420: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853308.10431: Set connection var ansible_shell_executable to /bin/sh 11762 1726853308.10454: Set connection var ansible_pipelining to False 11762 1726853308.10468: Set connection var ansible_connection to ssh 11762 1726853308.10496: variable 'ansible_shell_executable' from source: unknown 11762 1726853308.10676: variable 'ansible_connection' from source: unknown 11762 1726853308.10679: variable 'ansible_module_compression' from source: unknown 11762 1726853308.10681: variable 'ansible_shell_type' from source: unknown 11762 1726853308.10683: variable 'ansible_shell_executable' from source: unknown 11762 1726853308.10685: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.10687: variable 'ansible_pipelining' from source: unknown 11762 1726853308.10689: variable 'ansible_timeout' from source: unknown 11762 1726853308.10691: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853308.10694: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853308.10696: variable 'omit' from source: magic vars 11762 1726853308.10698: starting attempt loop 11762 1726853308.10700: running the handler 11762 1726853308.10701: _low_level_execute_command(): starting 11762 1726853308.10703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853308.11432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853308.11451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853308.11482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853308.11494: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853308.11589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853308.11604: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 11762 1726853308.11625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853308.11735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853308.13448: stdout chunk (state=3): >>>/root <<< 11762 1726853308.13584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853308.13602: stderr chunk (state=3): >>><<< 11762 1726853308.13613: stdout chunk (state=3): >>><<< 11762 1726853308.13647: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853308.13667: _low_level_execute_command(): starting 11762 1726853308.13679: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628 `" && echo ansible-tmp-1726853308.1365535-14544-140143948853628="` echo /root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628 `" ) && sleep 0' 11762 1726853308.14322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853308.14336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853308.14353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853308.14375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853308.14394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853308.14439: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853308.14502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853308.14517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853308.14541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853308.14649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853308.16624: stdout chunk (state=3): 
>>>ansible-tmp-1726853308.1365535-14544-140143948853628=/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628 <<< 11762 1726853308.16785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853308.16789: stdout chunk (state=3): >>><<< 11762 1726853308.16791: stderr chunk (state=3): >>><<< 11762 1726853308.16976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853308.1365535-14544-140143948853628=/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853308.16979: variable 'ansible_module_compression' from source: unknown 11762 1726853308.16982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853308.16984: variable 'ansible_facts' 
from source: unknown 11762 1726853308.17035: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/AnsiballZ_command.py 11762 1726853308.17223: Sending initial data 11762 1726853308.17232: Sent initial data (156 bytes) 11762 1726853308.17883: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853308.17985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853308.18007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853308.18024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853308.18046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853308.18145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853308.19767: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853308.19852: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853308.19947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpbcch6thb /root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/AnsiballZ_command.py <<< 11762 1726853308.19959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/AnsiballZ_command.py" <<< 11762 1726853308.20009: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpbcch6thb" to remote "/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/AnsiballZ_command.py" <<< 11762 1726853308.20889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853308.20979: stderr chunk (state=3): >>><<< 11762 1726853308.20984: stdout chunk (state=3): >>><<< 11762 1726853308.21035: done transferring module to remote 11762 1726853308.21048: _low_level_execute_command(): starting 11762 1726853308.21053: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/ 
/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/AnsiballZ_command.py && sleep 0' 11762 1726853308.21690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853308.21700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853308.21738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853308.21787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853308.21815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853308.21849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853308.21852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853308.21935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853308.23861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853308.23864: stdout chunk (state=3): >>><<< 11762 1726853308.23875: stderr chunk (state=3): >>><<< 11762 1726853308.24097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853308.24101: _low_level_execute_command(): starting 11762 1726853308.24104: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/AnsiballZ_command.py && sleep 0' 11762 1726853308.24475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853308.24490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853308.24501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853308.24515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853308.24528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853308.24534: stderr 
chunk (state=3): >>>debug2: match not found <<< 11762 1726853308.24551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853308.24557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853308.24565: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853308.24574: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853308.24583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853308.24593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853308.24605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853308.24612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853308.24620: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853308.24629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853308.24696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853308.24707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853308.24724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853308.24819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853308.40954: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::c4/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::1cc9:1bff:fec3:dbff/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 
fe80::1cc9:1bff:fec3:dbff/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:28.404862", "end": "2024-09-20 13:28:28.408602", "delta": "0:00:00.003740", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853308.42636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853308.42659: stderr chunk (state=3): >>><<< 11762 1726853308.42663: stdout chunk (state=3): >>><<< 11762 1726853308.42679: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::c4/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::1cc9:1bff:fec3:dbff/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 fe80::1cc9:1bff:fec3:dbff/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:28.404862", "end": "2024-09-20 13:28:28.408602", "delta": "0:00:00.003740", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853308.42712: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853308.42715: _low_level_execute_command(): starting 11762 1726853308.42722: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853308.1365535-14544-140143948853628/ > 
/dev/null 2>&1 && sleep 0' 11762 1726853308.43327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853308.43350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853308.43446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853308.45475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853308.45496: stderr chunk (state=3): >>><<< 11762 1726853308.45499: stdout chunk (state=3): >>><<< 11762 1726853308.45512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853308.45518: handler run complete 11762 1726853308.45536: Evaluated conditional (False): False 11762 1726853308.45644: variable 'address' from source: include params 11762 1726853308.45651: variable 'result' from source: set_fact 11762 1726853308.45666: Evaluated conditional (address in result.stdout): True 11762 1726853308.45678: attempt loop complete, returning result 11762 1726853308.45681: _execute() done 11762 1726853308.45683: dumping result to json 11762 1726853308.45689: done dumping result, returning 11762 1726853308.45696: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 [02083763-bbaf-d845-03d0-000000000dc7] 11762 1726853308.45700: sending task result for task 02083763-bbaf-d845-03d0-000000000dc7 11762 1726853308.45791: done sending task result for task 02083763-bbaf-d845-03d0-000000000dc7 11762 1726853308.45794: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003740", "end": "2024-09-20 13:28:28.408602", "rc": 0, "start": "2024-09-20 13:28:28.404862" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 
2001:db8::c4/128 scope global dynamic noprefixroute valid_lft 238sec preferred_lft 238sec inet6 2001:db8::1cc9:1bff:fec3:dbff/64 scope global dynamic noprefixroute valid_lft 1797sec preferred_lft 1797sec inet6 fe80::1cc9:1bff:fec3:dbff/64 scope link noprefixroute valid_lft forever preferred_lft forever 11762 1726853308.45867: no more pending results, returning what we have 11762 1726853308.45873: results queue empty 11762 1726853308.45874: checking for any_errors_fatal 11762 1726853308.45875: done checking for any_errors_fatal 11762 1726853308.45876: checking for max_fail_percentage 11762 1726853308.45878: done checking for max_fail_percentage 11762 1726853308.45879: checking to see if all hosts have failed and the running result is not ok 11762 1726853308.45879: done checking to see if all hosts have failed 11762 1726853308.45880: getting the remaining hosts for this loop 11762 1726853308.45882: done getting the remaining hosts for this loop 11762 1726853308.45885: getting the next task for host managed_node2 11762 1726853308.45894: done getting next task for host managed_node2 11762 1726853308.45897: ^ task is: TASK: Conditional asserts 11762 1726853308.45899: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853308.45906: getting variables 11762 1726853308.45907: in VariableManager get_vars() 11762 1726853308.45947: Calling all_inventory to load vars for managed_node2 11762 1726853308.45950: Calling groups_inventory to load vars for managed_node2 11762 1726853308.45952: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853308.45962: Calling all_plugins_play to load vars for managed_node2 11762 1726853308.45965: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853308.45967: Calling groups_plugins_play to load vars for managed_node2 11762 1726853308.47230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853308.48181: done with get_vars() 11762 1726853308.48197: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 13:28:28 -0400 (0:00:00.425) 0:00:58.913 ****** 11762 1726853308.48267: entering _queue_task() for managed_node2/include_tasks 11762 1726853308.48515: worker is 1 (out of 1 available) 11762 1726853308.48528: exiting _queue_task() for managed_node2/include_tasks 11762 1726853308.48543: done queuing things up, now waiting for results queue to drain 11762 1726853308.48545: waiting for pending results... 
11762 1726853308.48732: running TaskExecutor() for managed_node2/TASK: Conditional asserts 11762 1726853308.48812: in run() - task 02083763-bbaf-d845-03d0-0000000008f0 11762 1726853308.48823: variable 'ansible_search_path' from source: unknown 11762 1726853308.48826: variable 'ansible_search_path' from source: unknown 11762 1726853308.49043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853308.50531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853308.50586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853308.50617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853308.50641: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853308.50665: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853308.50727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853308.50751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853308.50768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853308.50795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 11762 1726853308.50806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853308.50921: dumping result to json 11762 1726853308.50924: done dumping result, returning 11762 1726853308.50929: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-d845-03d0-0000000008f0] 11762 1726853308.50941: sending task result for task 02083763-bbaf-d845-03d0-0000000008f0 11762 1726853308.51033: done sending task result for task 02083763-bbaf-d845-03d0-0000000008f0 11762 1726853308.51036: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 11762 1726853308.51087: no more pending results, returning what we have 11762 1726853308.51092: results queue empty 11762 1726853308.51092: checking for any_errors_fatal 11762 1726853308.51101: done checking for any_errors_fatal 11762 1726853308.51102: checking for max_fail_percentage 11762 1726853308.51103: done checking for max_fail_percentage 11762 1726853308.51104: checking to see if all hosts have failed and the running result is not ok 11762 1726853308.51105: done checking to see if all hosts have failed 11762 1726853308.51106: getting the remaining hosts for this loop 11762 1726853308.51107: done getting the remaining hosts for this loop 11762 1726853308.51111: getting the next task for host managed_node2 11762 1726853308.51118: done getting next task for host managed_node2 11762 1726853308.51120: ^ task is: TASK: Success in test '{{ lsr_description }}' 11762 1726853308.51123: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853308.51127: getting variables 11762 1726853308.51128: in VariableManager get_vars() 11762 1726853308.51177: Calling all_inventory to load vars for managed_node2 11762 1726853308.51180: Calling groups_inventory to load vars for managed_node2 11762 1726853308.51182: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853308.51192: Calling all_plugins_play to load vars for managed_node2 11762 1726853308.51195: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853308.51197: Calling groups_plugins_play to load vars for managed_node2 11762 1726853308.51982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853308.52828: done with get_vars() 11762 1726853308.52843: done getting variables 11762 1726853308.52887: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853308.52974: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 
13:28:28 -0400 (0:00:00.047) 0:00:58.960 ****** 11762 1726853308.52997: entering _queue_task() for managed_node2/debug 11762 1726853308.53239: worker is 1 (out of 1 available) 11762 1726853308.53252: exiting _queue_task() for managed_node2/debug 11762 1726853308.53265: done queuing things up, now waiting for results queue to drain 11762 1726853308.53267: waiting for pending results... 11762 1726853308.53451: running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 11762 1726853308.53517: in run() - task 02083763-bbaf-d845-03d0-0000000008f1 11762 1726853308.53530: variable 'ansible_search_path' from source: unknown 11762 1726853308.53533: variable 'ansible_search_path' from source: unknown 11762 1726853308.53563: calling self._execute() 11762 1726853308.53643: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.53651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.53661: variable 'omit' from source: magic vars 11762 1726853308.53927: variable 'ansible_distribution_major_version' from source: facts 11762 1726853308.53939: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853308.53942: variable 'omit' from source: magic vars 11762 1726853308.53973: variable 'omit' from source: magic vars 11762 1726853308.54040: variable 'lsr_description' from source: include params 11762 1726853308.54057: variable 'omit' from source: magic vars 11762 1726853308.54091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853308.54118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853308.54133: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853308.54148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853308.54159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853308.54189: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853308.54192: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.54195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.54263: Set connection var ansible_timeout to 10 11762 1726853308.54266: Set connection var ansible_shell_type to sh 11762 1726853308.54275: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853308.54279: Set connection var ansible_shell_executable to /bin/sh 11762 1726853308.54287: Set connection var ansible_pipelining to False 11762 1726853308.54293: Set connection var ansible_connection to ssh 11762 1726853308.54309: variable 'ansible_shell_executable' from source: unknown 11762 1726853308.54312: variable 'ansible_connection' from source: unknown 11762 1726853308.54315: variable 'ansible_module_compression' from source: unknown 11762 1726853308.54317: variable 'ansible_shell_type' from source: unknown 11762 1726853308.54319: variable 'ansible_shell_executable' from source: unknown 11762 1726853308.54321: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.54325: variable 'ansible_pipelining' from source: unknown 11762 1726853308.54328: variable 'ansible_timeout' from source: unknown 11762 1726853308.54332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.54432: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853308.54441: variable 'omit' from source: magic vars 11762 1726853308.54447: starting attempt loop 11762 1726853308.54450: running the handler 11762 1726853308.54492: handler run complete 11762 1726853308.54504: attempt loop complete, returning result 11762 1726853308.54507: _execute() done 11762 1726853308.54509: dumping result to json 11762 1726853308.54512: done dumping result, returning 11762 1726853308.54519: done running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [02083763-bbaf-d845-03d0-0000000008f1] 11762 1726853308.54524: sending task result for task 02083763-bbaf-d845-03d0-0000000008f1 11762 1726853308.54602: done sending task result for task 02083763-bbaf-d845-03d0-0000000008f1 11762 1726853308.54606: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
+++++ 11762 1726853308.54658: no more pending results, returning what we have 11762 1726853308.54661: results queue empty 11762 1726853308.54662: checking for any_errors_fatal 11762 1726853308.54670: done checking for any_errors_fatal 11762 1726853308.54672: checking for max_fail_percentage 11762 1726853308.54675: done checking for max_fail_percentage 11762 1726853308.54676: checking to see if all hosts have failed and the running result is not ok 11762 1726853308.54677: done checking to see if all hosts have failed 11762 1726853308.54677: getting the remaining hosts for this loop 11762 1726853308.54679: done getting the remaining hosts for this loop 11762 1726853308.54682: getting the next task for host managed_node2 11762 1726853308.54689: done getting next task for host managed_node2 11762 1726853308.54691: ^ task is: TASK: Cleanup 11762 1726853308.54694: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853308.54699: getting variables 11762 1726853308.54700: in VariableManager get_vars() 11762 1726853308.54748: Calling all_inventory to load vars for managed_node2 11762 1726853308.54751: Calling groups_inventory to load vars for managed_node2 11762 1726853308.54753: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853308.54762: Calling all_plugins_play to load vars for managed_node2 11762 1726853308.54764: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853308.54766: Calling groups_plugins_play to load vars for managed_node2 11762 1726853308.55647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853308.56500: done with get_vars() 11762 1726853308.56514: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 13:28:28 -0400 (0:00:00.035) 0:00:58.996 ****** 11762 1726853308.56583: entering _queue_task() for managed_node2/include_tasks 11762 1726853308.56816: worker is 1 (out of 1 available) 11762 1726853308.56830: exiting _queue_task() for managed_node2/include_tasks 11762 1726853308.56849: done queuing things up, now waiting for results queue to drain 11762 1726853308.56851: waiting for pending results... 
11762 1726853308.57025: running TaskExecutor() for managed_node2/TASK: Cleanup 11762 1726853308.57093: in run() - task 02083763-bbaf-d845-03d0-0000000008f5 11762 1726853308.57106: variable 'ansible_search_path' from source: unknown 11762 1726853308.57109: variable 'ansible_search_path' from source: unknown 11762 1726853308.57145: variable 'lsr_cleanup' from source: include params 11762 1726853308.57290: variable 'lsr_cleanup' from source: include params 11762 1726853308.57347: variable 'omit' from source: magic vars 11762 1726853308.57449: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.57455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.57465: variable 'omit' from source: magic vars 11762 1726853308.57631: variable 'ansible_distribution_major_version' from source: facts 11762 1726853308.57639: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853308.57648: variable 'item' from source: unknown 11762 1726853308.57691: variable 'item' from source: unknown 11762 1726853308.57713: variable 'item' from source: unknown 11762 1726853308.57757: variable 'item' from source: unknown 11762 1726853308.57883: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.57887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.57889: variable 'omit' from source: magic vars 11762 1726853308.57961: variable 'ansible_distribution_major_version' from source: facts 11762 1726853308.57965: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853308.57970: variable 'item' from source: unknown 11762 1726853308.58016: variable 'item' from source: unknown 11762 1726853308.58037: variable 'item' from source: unknown 11762 1726853308.58079: variable 'item' from source: unknown 11762 1726853308.58140: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 
1726853308.58153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.58157: variable 'omit' from source: magic vars 11762 1726853308.58255: variable 'ansible_distribution_major_version' from source: facts 11762 1726853308.58259: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853308.58269: variable 'item' from source: unknown 11762 1726853308.58308: variable 'item' from source: unknown 11762 1726853308.58326: variable 'item' from source: unknown 11762 1726853308.58366: variable 'item' from source: unknown 11762 1726853308.58425: dumping result to json 11762 1726853308.58428: done dumping result, returning 11762 1726853308.58430: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-d845-03d0-0000000008f5] 11762 1726853308.58432: sending task result for task 02083763-bbaf-d845-03d0-0000000008f5 11762 1726853308.58466: done sending task result for task 02083763-bbaf-d845-03d0-0000000008f5 11762 1726853308.58468: WORKER PROCESS EXITING 11762 1726853308.58502: no more pending results, returning what we have 11762 1726853308.58507: in VariableManager get_vars() 11762 1726853308.58556: Calling all_inventory to load vars for managed_node2 11762 1726853308.58558: Calling groups_inventory to load vars for managed_node2 11762 1726853308.58560: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853308.58574: Calling all_plugins_play to load vars for managed_node2 11762 1726853308.58577: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853308.58579: Calling groups_plugins_play to load vars for managed_node2 11762 1726853308.59359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853308.60209: done with get_vars() 11762 1726853308.60223: variable 'ansible_search_path' from source: unknown 11762 1726853308.60224: variable 'ansible_search_path' from source: unknown 11762 
1726853308.60253: variable 'ansible_search_path' from source: unknown 11762 1726853308.60254: variable 'ansible_search_path' from source: unknown 11762 1726853308.60273: variable 'ansible_search_path' from source: unknown 11762 1726853308.60274: variable 'ansible_search_path' from source: unknown 11762 1726853308.60289: we have included files to process 11762 1726853308.60290: generating all_blocks data 11762 1726853308.60291: done generating all_blocks data 11762 1726853308.60294: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11762 1726853308.60295: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11762 1726853308.60297: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11762 1726853308.60392: in VariableManager get_vars() 11762 1726853308.60409: done with get_vars() 11762 1726853308.60413: variable 'omit' from source: magic vars 11762 1726853308.60438: variable 'omit' from source: magic vars 11762 1726853308.60473: in VariableManager get_vars() 11762 1726853308.60484: done with get_vars() 11762 1726853308.60501: in VariableManager get_vars() 11762 1726853308.60513: done with get_vars() 11762 1726853308.60538: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11762 1726853308.60606: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11762 1726853308.60690: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11762 1726853308.60909: in VariableManager get_vars() 11762 1726853308.60924: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 
11762 1726853308.62202: done processing included file
11762 1726853308.62203: iterating over new_blocks loaded from include file
11762 1726853308.62204: in VariableManager get_vars()
11762 1726853308.62217: done with get_vars()
11762 1726853308.62218: filtering new block on tags
11762 1726853308.62463: done filtering new block on tags
11762 1726853308.62466: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node2 => (item=tasks/cleanup_bond_profile+device.yml)
11762 1726853308.62470: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
11762 1726853308.62470: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
11762 1726853308.62475: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
11762 1726853308.62688: done processing included file
11762 1726853308.62689: iterating over new_blocks loaded from include file
11762 1726853308.62690: in VariableManager get_vars()
11762 1726853308.62702: done with get_vars()
11762 1726853308.62703: filtering new block on tags
11762 1726853308.62724: done filtering new block on tags
11762 1726853308.62726: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/remove_test_interfaces_with_dhcp.yml)
11762 1726853308.66767: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml
11762 1726853308.66775: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml
11762 1726853308.66779: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml
11762 1726853308.67129: done processing included file
11762 1726853308.67132: iterating over new_blocks loaded from include file
11762 1726853308.67133: in VariableManager get_vars()
11762 1726853308.67159: done with get_vars()
11762 1726853308.67162: filtering new block on tags
11762 1726853308.67194: done filtering new block on tags
11762 1726853308.67196: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 => (item=tasks/check_network_dns.yml)
11762 1726853308.67200: extending task lists for all hosts with included blocks
11762 1726853308.69285: done extending task lists
11762 1726853308.69287: done processing included files
11762 1726853308.69287: results queue empty
11762 1726853308.69288: checking for any_errors_fatal
11762 1726853308.69290: done checking for any_errors_fatal
11762 1726853308.69291: checking for max_fail_percentage
11762 1726853308.69291: done checking for max_fail_percentage
11762 1726853308.69292: checking to see if all hosts have failed and the running result is not ok
11762 1726853308.69292: done checking to see if all hosts have failed
11762 1726853308.69293: getting the remaining hosts for this loop
11762 1726853308.69294: done getting the remaining hosts for this loop
11762 1726853308.69295: getting the next task for host managed_node2
11762 1726853308.69298: done getting next task for host managed_node2
11762 1726853308.69300: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
11762 1726853308.69302: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853308.69309: getting variables
11762 1726853308.69310: in VariableManager get_vars()
11762 1726853308.69322: Calling all_inventory to load vars for managed_node2
11762 1726853308.69323: Calling groups_inventory to load vars for managed_node2
11762 1726853308.69325: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853308.69328: Calling all_plugins_play to load vars for managed_node2
11762 1726853308.69329: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853308.69331: Calling groups_plugins_play to load vars for managed_node2
11762 1726853308.70120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853308.71691: done with get_vars()
11762 1726853308.71711: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:28:28 -0400 (0:00:00.152) 0:00:59.148 ******
11762 1726853308.71792: entering _queue_task() for managed_node2/include_tasks
11762 1726853308.72146: worker is 1 (out of 1 available)
11762 1726853308.72160: exiting _queue_task() for managed_node2/include_tasks
11762 1726853308.72176: done queuing things up, now waiting for results queue to drain
11762 1726853308.72178: waiting for pending results...
11762 1726853308.72687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
11762 1726853308.72693: in run() - task 02083763-bbaf-d845-03d0-000000000e0a
11762 1726853308.72697: variable 'ansible_search_path' from source: unknown
11762 1726853308.72699: variable 'ansible_search_path' from source: unknown
11762 1726853308.72729: calling self._execute()
11762 1726853308.72831: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853308.72845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853308.72976: variable 'omit' from source: magic vars
11762 1726853308.73234: variable 'ansible_distribution_major_version' from source: facts
11762 1726853308.73254: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853308.73265: _execute() done
11762 1726853308.73274: dumping result to json
11762 1726853308.73282: done dumping result, returning
11762 1726853308.73292: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-d845-03d0-000000000e0a]
11762 1726853308.73302: sending task result for task 02083763-bbaf-d845-03d0-000000000e0a
11762 1726853308.73449: no more pending results, returning what we have
11762 1726853308.73454: in VariableManager get_vars()
11762 1726853308.73709: Calling all_inventory to load vars for managed_node2
11762 1726853308.73712: Calling groups_inventory to load vars for managed_node2
11762 1726853308.73714: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853308.73723: Calling all_plugins_play to load vars for managed_node2
11762 1726853308.73726: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853308.73730: Calling groups_plugins_play to load vars for managed_node2
11762 1726853308.74249: done sending task result for task 02083763-bbaf-d845-03d0-000000000e0a
11762 1726853308.74253: WORKER PROCESS EXITING
11762 1726853308.75114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853308.76673: done with get_vars()
11762 1726853308.76695: variable 'ansible_search_path' from source: unknown
11762 1726853308.76697: variable 'ansible_search_path' from source: unknown
11762 1726853308.76738: we have included files to process
11762 1726853308.76739: generating all_blocks data
11762 1726853308.76741: done generating all_blocks data
11762 1726853308.76745: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
11762 1726853308.76747: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
11762 1726853308.76750: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
11762 1726853308.77353: done processing included file
11762 1726853308.77355: iterating over new_blocks loaded from include file
11762 1726853308.77356: in VariableManager get_vars()
11762 1726853308.77389: done with get_vars()
11762 1726853308.77391: filtering new block on tags
11762 1726853308.77428: done filtering new block on tags
11762 1726853308.77431: in VariableManager get_vars()
11762 1726853308.77463: done with get_vars()
11762 1726853308.77465: filtering new block on tags
11762 1726853308.77514: done filtering new block on tags
11762 1726853308.77516: in VariableManager get_vars()
11762 1726853308.77549: done with get_vars()
11762 1726853308.77551: filtering new block on tags
11762 1726853308.77598: done filtering new block on tags
11762 1726853308.77600: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2
11762 1726853308.77606: extending task lists for all hosts with included blocks
11762 1726853308.79208: done extending task lists
11762 1726853308.79209: done processing included files
11762 1726853308.79210: results queue empty
11762 1726853308.79210: checking for any_errors_fatal
11762 1726853308.79214: done checking for any_errors_fatal
11762 1726853308.79214: checking for max_fail_percentage
11762 1726853308.79215: done checking for max_fail_percentage
11762 1726853308.79216: checking to see if all hosts have failed and the running result is not ok
11762 1726853308.79217: done checking to see if all hosts have failed
11762 1726853308.79217: getting the remaining hosts for this loop
11762 1726853308.79219: done getting the remaining hosts for this loop
11762 1726853308.79220: getting the next task for host managed_node2
11762 1726853308.79223: done getting next task for host managed_node2
11762 1726853308.79225: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
11762 1726853308.79228: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853308.79237: getting variables
11762 1726853308.79237: in VariableManager get_vars()
11762 1726853308.79252: Calling all_inventory to load vars for managed_node2
11762 1726853308.79253: Calling groups_inventory to load vars for managed_node2
11762 1726853308.79255: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853308.79259: Calling all_plugins_play to load vars for managed_node2
11762 1726853308.79260: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853308.79262: Calling groups_plugins_play to load vars for managed_node2
11762 1726853308.79900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853308.80870: done with get_vars()
11762 1726853308.80891: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 13:28:28 -0400 (0:00:00.091) 0:00:59.240 ******
11762 1726853308.80970: entering _queue_task() for managed_node2/setup
11762 1726853308.81332: worker is 1 (out of 1 available)
11762 1726853308.81345: exiting _queue_task() for managed_node2/setup
11762 1726853308.81358: done queuing things up, now waiting for results queue to drain
11762 1726853308.81359: waiting for pending results...
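Before every task above, the trace logs a line like "Evaluated conditional (ansible_distribution_major_version != '6'): True". Ansible renders the task's `when:` expression through Jinja2 against the host's gathered facts; as a rough, simplified illustration of that gate (not Ansible's actual implementation), a minimal sketch:

```python
def evaluated_conditional(facts: dict) -> bool:
    # Stand-in for the `when:` clause seen repeatedly in this log:
    #     ansible_distribution_major_version != '6'
    # Ansible templates the expression with Jinja2 against the task's
    # variables; here the comparison is done directly in Python.
    return facts.get("ansible_distribution_major_version") != "6"

# On the Fedora managed node the conditional is True, so the task runs;
# on an EL6 host it would be False and the task would be skipped.
print(evaluated_conditional({"ansible_distribution_major_version": "9"}))  # True
print(evaluated_conditional({"ansible_distribution_major_version": "6"}))  # False
```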
11762 1726853308.81692: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
11762 1726853308.81826: in run() - task 02083763-bbaf-d845-03d0-000000000fde
11762 1726853308.81876: variable 'ansible_search_path' from source: unknown
11762 1726853308.81879: variable 'ansible_search_path' from source: unknown
11762 1726853308.81895: calling self._execute()
11762 1726853308.82000: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853308.82013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853308.82035: variable 'omit' from source: magic vars
11762 1726853308.82324: variable 'ansible_distribution_major_version' from source: facts
11762 1726853308.82334: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853308.82490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11762 1726853308.83975: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11762 1726853308.84041: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11762 1726853308.84067: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11762 1726853308.84115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11762 1726853308.84177: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11762 1726853308.84225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11762 1726853308.84259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11762 1726853308.84292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853308.84476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11762 1726853308.84479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11762 1726853308.84481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11762 1726853308.84484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11762 1726853308.84486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853308.84498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11762 1726853308.84514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11762 1726853308.84670: variable '__network_required_facts' from source: role '' defaults
11762 1726853308.84691: variable 'ansible_facts' from source: unknown
11762 1726853308.85409: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
11762 1726853308.85418: when evaluation is False, skipping this task
11762 1726853308.85427: _execute() done
11762 1726853308.85434: dumping result to json
11762 1726853308.85441: done dumping result, returning
11762 1726853308.85453: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-d845-03d0-000000000fde]
11762 1726853308.85464: sending task result for task 02083763-bbaf-d845-03d0-000000000fde
11762 1726853308.85574: done sending task result for task 02083763-bbaf-d845-03d0-000000000fde
11762 1726853308.85583: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
11762 1726853308.85646: no more pending results, returning what we have
11762 1726853308.85651: results queue empty
11762 1726853308.85652: checking for any_errors_fatal
11762 1726853308.85653: done checking for any_errors_fatal
11762 1726853308.85654: checking for max_fail_percentage
11762 1726853308.85656: done checking for max_fail_percentage
11762 1726853308.85657: checking to see if all hosts have failed and the running result is not ok
11762 1726853308.85657: done checking to see if all hosts have failed
11762 1726853308.85658: getting the remaining hosts for this loop
11762 1726853308.85660: done getting the remaining hosts for this loop
11762 1726853308.85663: getting the next task for host managed_node2
11762 1726853308.85676: done getting next task for host managed_node2
11762 1726853308.85679: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
11762 1726853308.85685: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853308.85733: getting variables
11762 1726853308.85735: in VariableManager get_vars()
11762 1726853308.85787: Calling all_inventory to load vars for managed_node2
11762 1726853308.85790: Calling groups_inventory to load vars for managed_node2
11762 1726853308.85792: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853308.85801: Calling all_plugins_play to load vars for managed_node2
11762 1726853308.85803: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853308.85811: Calling groups_plugins_play to load vars for managed_node2
11762 1726853308.86698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853308.87568: done with get_vars()
11762 1726853308.87586: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 13:28:28 -0400 (0:00:00.067) 0:00:59.307 ******
11762 1726853308.87687: entering _queue_task() for managed_node2/stat
11762 1726853308.88005: worker is 1 (out of 1 available)
11762 1726853308.88018: exiting _queue_task() for managed_node2/stat
11762 1726853308.88031: done queuing things up, now waiting for results queue to drain
11762 1726853308.88032: waiting for pending results...
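The "Ensure ansible_facts used by role are present" task above was skipped because its conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False: every required fact name was already present, so no extra fact gathering was needed. A minimal sketch of that set-difference check (fact names below are hypothetical; the role's real list lives in its defaults):

```python
def missing_required_facts(required: list, gathered_facts: dict) -> list:
    # Mirrors the Jinja2 expression from the log:
    #     __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    # i.e. "which required fact names are not yet in ansible_facts?"
    return [name for name in required if name not in gathered_facts]

# Illustrative fact names only (not the role's actual defaults list).
required = ["distribution", "distribution_major_version"]
facts = {"distribution": "Fedora", "distribution_major_version": "9"}
print(missing_required_facts(required, facts))  # [] -> conditional False, task skipped
```

An empty difference means `length > 0` is False, which is exactly the "when evaluation is False, skipping this task" line in the log.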
11762 1726853308.88402: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree
11762 1726853308.88515: in run() - task 02083763-bbaf-d845-03d0-000000000fe0
11762 1726853308.88537: variable 'ansible_search_path' from source: unknown
11762 1726853308.88544: variable 'ansible_search_path' from source: unknown
11762 1726853308.88591: calling self._execute()
11762 1726853308.88697: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853308.88710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853308.88728: variable 'omit' from source: magic vars
11762 1726853308.89105: variable 'ansible_distribution_major_version' from source: facts
11762 1726853308.89116: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853308.89263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11762 1726853308.89676: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11762 1726853308.89680: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11762 1726853308.89682: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11762 1726853308.89685: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11762 1726853308.89700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11762 1726853308.89732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11762 1726853308.89779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853308.89818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11762 1726853308.89935: variable '__network_is_ostree' from source: set_fact
11762 1726853308.89949: Evaluated conditional (not __network_is_ostree is defined): False
11762 1726853308.90024: when evaluation is False, skipping this task
11762 1726853308.90027: _execute() done
11762 1726853308.90029: dumping result to json
11762 1726853308.90032: done dumping result, returning
11762 1726853308.90034: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-d845-03d0-000000000fe0]
11762 1726853308.90036: sending task result for task 02083763-bbaf-d845-03d0-000000000fe0
11762 1726853308.90105: done sending task result for task 02083763-bbaf-d845-03d0-000000000fe0
11762 1726853308.90108: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
11762 1726853308.90167: no more pending results, returning what we have
11762 1726853308.90175: results queue empty
11762 1726853308.90176: checking for any_errors_fatal
11762 1726853308.90184: done checking for any_errors_fatal
11762 1726853308.90185: checking for max_fail_percentage
11762 1726853308.90188: done checking for max_fail_percentage
11762 1726853308.90189: checking to see if all hosts have failed and the running result is not ok
11762 1726853308.90190: done checking to see if all hosts have failed
11762 1726853308.90190: getting the remaining hosts for this loop
11762 1726853308.90193: done getting the remaining hosts for this loop
11762 1726853308.90197: getting the next task for host managed_node2
11762 1726853308.90205: done getting next task for host managed_node2
11762 1726853308.90208: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
11762 1726853308.90214: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853308.90359: getting variables
11762 1726853308.90362: in VariableManager get_vars()
11762 1726853308.90417: Calling all_inventory to load vars for managed_node2
11762 1726853308.90421: Calling groups_inventory to load vars for managed_node2
11762 1726853308.90423: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853308.90433: Calling all_plugins_play to load vars for managed_node2
11762 1726853308.90436: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853308.90438: Calling groups_plugins_play to load vars for managed_node2
11762 1726853308.91449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853308.92335: done with get_vars()
11762 1726853308.92360: done getting variables
11762 1726853308.92424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 13:28:28 -0400 (0:00:00.047) 0:00:59.355 ******
11762 1726853308.92465: entering _queue_task() for managed_node2/set_fact
11762 1726853308.92826: worker is 1 (out of 1 available)
11762 1726853308.92840: exiting _queue_task() for managed_node2/set_fact
11762 1726853308.92858: done queuing things up, now waiting for results queue to drain
11762 1726853308.92860: waiting for pending results...
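The ostree check and "Set flag" tasks are both skipped under `not __network_is_ostree is defined`: the log shows the variable already comes "from source: set_fact", so an earlier pass recorded it and the conditional is False. The `is defined` test amounts to a membership check on the available variables, roughly:

```python
def should_check_ostree(variables: dict) -> bool:
    # Simplified stand-in for the Jinja2 test from the log:
    #     not __network_is_ostree is defined
    # True only while the fact has never been set; once set_fact has
    # recorded it, the stat/set_fact tasks are skipped on later passes.
    return "__network_is_ostree" not in variables

print(should_check_ostree({}))                              # True  -> run the check
print(should_check_ostree({"__network_is_ostree": False}))  # False -> skip, as in the log
```

This is a common Ansible idiom for running an expensive detection step at most once per play.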
11762 1726853308.93276: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
11762 1726853308.93377: in run() - task 02083763-bbaf-d845-03d0-000000000fe1
11762 1726853308.93388: variable 'ansible_search_path' from source: unknown
11762 1726853308.93393: variable 'ansible_search_path' from source: unknown
11762 1726853308.93424: calling self._execute()
11762 1726853308.93502: variable 'ansible_host' from source: host vars for 'managed_node2'
11762 1726853308.93508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11762 1726853308.93517: variable 'omit' from source: magic vars
11762 1726853308.93814: variable 'ansible_distribution_major_version' from source: facts
11762 1726853308.93826: Evaluated conditional (ansible_distribution_major_version != '6'): True
11762 1726853308.93946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11762 1726853308.94152: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11762 1726853308.94187: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11762 1726853308.94214: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11762 1726853308.94242: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11762 1726853308.94309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11762 1726853308.94327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11762 1726853308.94348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11762 1726853308.94368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11762 1726853308.94436: variable '__network_is_ostree' from source: set_fact
11762 1726853308.94441: Evaluated conditional (not __network_is_ostree is defined): False
11762 1726853308.94443: when evaluation is False, skipping this task
11762 1726853308.94452: _execute() done
11762 1726853308.94454: dumping result to json
11762 1726853308.94457: done dumping result, returning
11762 1726853308.94463: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-d845-03d0-000000000fe1]
11762 1726853308.94472: sending task result for task 02083763-bbaf-d845-03d0-000000000fe1
11762 1726853308.94550: done sending task result for task 02083763-bbaf-d845-03d0-000000000fe1
11762 1726853308.94553: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
11762 1726853308.94642: no more pending results, returning what we have
11762 1726853308.94646: results queue empty
11762 1726853308.94647: checking for any_errors_fatal
11762 1726853308.94652: done checking for any_errors_fatal
11762 1726853308.94652: checking for max_fail_percentage
11762 1726853308.94654: done checking for max_fail_percentage
11762 1726853308.94655: checking to see if all hosts have failed and the running result is not ok
11762 1726853308.94656: done checking to see if all hosts have failed
11762 1726853308.94656: getting the remaining hosts for this loop
11762 1726853308.94658: done getting the remaining hosts for this loop
11762 1726853308.94662: getting the next task for host managed_node2
11762 1726853308.94674: done getting next task for host managed_node2
11762 1726853308.94677: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
11762 1726853308.94683: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853308.94702: getting variables
11762 1726853308.94703: in VariableManager get_vars()
11762 1726853308.94741: Calling all_inventory to load vars for managed_node2
11762 1726853308.94744: Calling groups_inventory to load vars for managed_node2
11762 1726853308.94746: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853308.94754: Calling all_plugins_play to load vars for managed_node2
11762 1726853308.94756: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853308.94758: Calling groups_plugins_play to load vars for managed_node2
11762 1726853308.95540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853308.96415: done with get_vars()
11762 1726853308.96431: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 13:28:28 -0400 (0:00:00.040) 0:00:59.395 ******
11762 1726853308.96504: entering _queue_task() for managed_node2/service_facts
11762 1726853308.96761: worker is 1 (out of 1 available)
11762 1726853308.96778: exiting _queue_task() for managed_node2/service_facts
11762 1726853308.96794: done queuing things up, now waiting for results queue to drain
11762 1726853308.96796: waiting for pending results...
11762 1726853308.96986: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11762 1726853308.97088: in run() - task 02083763-bbaf-d845-03d0-000000000fe3 11762 1726853308.97100: variable 'ansible_search_path' from source: unknown 11762 1726853308.97104: variable 'ansible_search_path' from source: unknown 11762 1726853308.97135: calling self._execute() 11762 1726853308.97209: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.97212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.97221: variable 'omit' from source: magic vars 11762 1726853308.97512: variable 'ansible_distribution_major_version' from source: facts 11762 1726853308.97522: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853308.97528: variable 'omit' from source: magic vars 11762 1726853308.97595: variable 'omit' from source: magic vars 11762 1726853308.97618: variable 'omit' from source: magic vars 11762 1726853308.97652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853308.97690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853308.97707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853308.97720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853308.97730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853308.97756: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853308.97759: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.97762: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853308.97835: Set connection var ansible_timeout to 10 11762 1726853308.97838: Set connection var ansible_shell_type to sh 11762 1726853308.97840: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853308.97849: Set connection var ansible_shell_executable to /bin/sh 11762 1726853308.97855: Set connection var ansible_pipelining to False 11762 1726853308.97861: Set connection var ansible_connection to ssh 11762 1726853308.97880: variable 'ansible_shell_executable' from source: unknown 11762 1726853308.97883: variable 'ansible_connection' from source: unknown 11762 1726853308.97888: variable 'ansible_module_compression' from source: unknown 11762 1726853308.97890: variable 'ansible_shell_type' from source: unknown 11762 1726853308.97893: variable 'ansible_shell_executable' from source: unknown 11762 1726853308.97895: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853308.97899: variable 'ansible_pipelining' from source: unknown 11762 1726853308.97902: variable 'ansible_timeout' from source: unknown 11762 1726853308.97905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853308.98052: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853308.98062: variable 'omit' from source: magic vars 11762 1726853308.98067: starting attempt loop 11762 1726853308.98070: running the handler 11762 1726853308.98083: _low_level_execute_command(): starting 11762 1726853308.98092: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853308.98614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11762 1726853308.98618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853308.98622: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853308.98625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853308.98669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853308.98674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853308.98676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853308.98760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853309.00479: stdout chunk (state=3): >>>/root <<< 11762 1726853309.00574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853309.00603: stderr chunk (state=3): >>><<< 11762 1726853309.00607: stdout chunk (state=3): >>><<< 11762 1726853309.00627: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853309.00639: _low_level_execute_command(): starting 11762 1726853309.00648: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540 `" && echo ansible-tmp-1726853309.0062814-14581-220131654015540="` echo /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540 `" ) && sleep 0' 11762 1726853309.01107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853309.01111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853309.01113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853309.01124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853309.01126: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853309.01175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853309.01181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853309.01183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853309.01254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853309.03222: stdout chunk (state=3): >>>ansible-tmp-1726853309.0062814-14581-220131654015540=/root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540 <<< 11762 1726853309.03325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853309.03361: stderr chunk (state=3): >>><<< 11762 1726853309.03364: stdout chunk (state=3): >>><<< 11762 1726853309.03378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853309.0062814-14581-220131654015540=/root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853309.03477: variable 'ansible_module_compression' from source: unknown 11762 1726853309.03481: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11762 1726853309.03498: variable 'ansible_facts' from source: unknown 11762 1726853309.03556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/AnsiballZ_service_facts.py 11762 1726853309.03662: Sending initial data 11762 1726853309.03666: Sent initial data (162 bytes) 11762 1726853309.04124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853309.04127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853309.04130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853309.04132: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853309.04135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853309.04191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853309.04195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853309.04199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853309.04269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853309.05905: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11762 1726853309.05912: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853309.05980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853309.06049: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpzstymxs0 /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/AnsiballZ_service_facts.py <<< 11762 1726853309.06052: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/AnsiballZ_service_facts.py" <<< 11762 1726853309.06116: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpzstymxs0" to remote "/root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/AnsiballZ_service_facts.py" <<< 11762 1726853309.06121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/AnsiballZ_service_facts.py" <<< 11762 1726853309.06762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853309.06804: stderr chunk (state=3): >>><<< 11762 1726853309.06808: stdout chunk (state=3): >>><<< 11762 1726853309.06869: done transferring module to remote 11762 1726853309.06880: _low_level_execute_command(): starting 11762 1726853309.06884: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/ /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/AnsiballZ_service_facts.py && sleep 0' 11762 1726853309.07319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853309.07322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853309.07325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853309.07330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853309.07377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853309.07382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853309.07457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853309.09306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853309.09332: stderr chunk (state=3): >>><<< 11762 1726853309.09335: stdout chunk (state=3): >>><<< 11762 1726853309.09348: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853309.09351: _low_level_execute_command(): starting 11762 1726853309.09353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/AnsiballZ_service_facts.py && sleep 0' 11762 1726853309.09782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853309.09786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853309.09788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853309.09790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853309.09792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853309.09839: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853309.09842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853309.09925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853310.76397: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11762 1726853310.77853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853310.77902: stderr chunk (state=3): >>><<< 11762 1726853310.77905: stdout chunk (state=3): >>><<< 11762 1726853310.78085: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", 
"source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
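The JSON blob above is the `ansible_facts.services` mapping returned by the `service_facts` module: each key is a unit name mapped to its `state`, `status`, and `source`. As a minimal sketch of how such a result can be consumed (the sample dict below is a hand-copied excerpt of the log output, not the full payload):

```python
# Sketch: filter an ansible_facts.services mapping (as dumped in the
# log above) for units that are currently running. The sample dict is
# a hand-copied excerpt from the module output, not the full result.
services = {
    "systemd-journald.service": {"state": "running", "status": "static", "source": "systemd"},
    "systemd-logind.service": {"state": "running", "status": "static", "source": "systemd"},
    "systemd-pstore.service": {"state": "stopped", "status": "enabled", "source": "systemd"},
    "wpa_supplicant.service": {"state": "running", "status": "enabled", "source": "systemd"},
}

def running_services(services):
    """Return the names of services whose state is 'running', sorted."""
    return sorted(name for name, info in services.items()
                  if info["state"] == "running")

print(running_services(services))
# -> ['systemd-journald.service', 'systemd-logind.service', 'wpa_supplicant.service']
```

This is the same shape of check the `fedora.linux_system_roles.network` role performs with this task: it gathers the services mapping so later tasks can branch on which network services are active.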
11762 1726853310.80281: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853310.80300: _low_level_execute_command(): starting 11762 1726853310.80311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853309.0062814-14581-220131654015540/ > /dev/null 2>&1 && sleep 0' 11762 1726853310.80990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853310.81006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853310.81101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853310.81141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853310.81160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853310.81186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853310.81303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853310.83358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853310.83362: stdout chunk (state=3): >>><<< 11762 1726853310.83365: stderr chunk (state=3): >>><<< 11762 1726853310.83381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853310.83654: handler run 
complete 11762 1726853310.83764: variable 'ansible_facts' from source: unknown 11762 1726853310.84133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853310.85277: variable 'ansible_facts' from source: unknown 11762 1726853310.85578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853310.85946: attempt loop complete, returning result 11762 1726853310.86042: _execute() done 11762 1726853310.86052: dumping result to json 11762 1726853310.86101: done dumping result, returning 11762 1726853310.86161: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-d845-03d0-000000000fe3] 11762 1726853310.86173: sending task result for task 02083763-bbaf-d845-03d0-000000000fe3 11762 1726853310.88547: done sending task result for task 02083763-bbaf-d845-03d0-000000000fe3 11762 1726853310.88550: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853310.88658: no more pending results, returning what we have 11762 1726853310.88661: results queue empty 11762 1726853310.88662: checking for any_errors_fatal 11762 1726853310.88664: done checking for any_errors_fatal 11762 1726853310.88665: checking for max_fail_percentage 11762 1726853310.88666: done checking for max_fail_percentage 11762 1726853310.88667: checking to see if all hosts have failed and the running result is not ok 11762 1726853310.88668: done checking to see if all hosts have failed 11762 1726853310.88668: getting the remaining hosts for this loop 11762 1726853310.88670: done getting the remaining hosts for this loop 11762 1726853310.88675: getting the next task for host managed_node2 11762 1726853310.88680: done getting next task for host managed_node2 11762 
1726853310.88683: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11762 1726853310.88691: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853310.88702: getting variables 11762 1726853310.88703: in VariableManager get_vars() 11762 1726853310.88737: Calling all_inventory to load vars for managed_node2 11762 1726853310.88740: Calling groups_inventory to load vars for managed_node2 11762 1726853310.88742: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853310.88750: Calling all_plugins_play to load vars for managed_node2 11762 1726853310.88753: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853310.88755: Calling groups_plugins_play to load vars for managed_node2 11762 1726853310.91696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853310.95033: done with get_vars() 11762 1726853310.95068: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:30 -0400 (0:00:01.986) 0:01:01.382 ****** 11762 1726853310.95381: entering _queue_task() for managed_node2/package_facts 11762 1726853310.95921: worker is 1 (out of 1 available) 11762 1726853310.95935: exiting _queue_task() for managed_node2/package_facts 11762 1726853310.95947: done queuing things up, now waiting for results queue to drain 11762 1726853310.95948: waiting for pending results... 
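The task queued above runs the `package_facts` module, whose result populates `ansible_facts.packages`: a mapping from package name to a *list* of installed instances (a list because packages such as kernels can be installed multiple times). A hedged sketch of consuming that structure, with hand-written sample data (the package names and versions below are invented for the demo, not taken from this host):

```python
# Sketch: ansible_facts.packages maps each package name to a list of
# installed instances (dicts with name/version/release/arch keys).
# Sample data is invented for illustration.
packages = {
    "openssh-server": [
        {"name": "openssh-server", "version": "9.8p1", "release": "1", "arch": "x86_64"},
    ],
    "kernel": [
        {"name": "kernel", "version": "6.10.0", "release": "1", "arch": "x86_64"},
        {"name": "kernel", "version": "6.11.0", "release": "1", "arch": "x86_64"},
    ],
}

def installed_versions(packages, name):
    """Return every installed version of a package; multi-installed
    packages (e.g. kernels) yield more than one entry."""
    return [p["version"] for p in packages.get(name, [])]

print(installed_versions(packages, "kernel"))  # -> ['6.10.0', '6.11.0']
```

The role uses this fact the same way it uses the services mapping: subsequent tasks condition on whether particular network packages are present.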
11762 1726853310.96494: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11762 1726853310.96593: in run() - task 02083763-bbaf-d845-03d0-000000000fe4 11762 1726853310.96626: variable 'ansible_search_path' from source: unknown 11762 1726853310.96676: variable 'ansible_search_path' from source: unknown 11762 1726853310.96681: calling self._execute() 11762 1726853310.96788: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853310.96800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853310.96814: variable 'omit' from source: magic vars 11762 1726853310.97228: variable 'ansible_distribution_major_version' from source: facts 11762 1726853310.97248: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853310.97369: variable 'omit' from source: magic vars 11762 1726853310.97374: variable 'omit' from source: magic vars 11762 1726853310.97404: variable 'omit' from source: magic vars 11762 1726853310.97451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853310.97504: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853310.97530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853310.97554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853310.97573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853310.97613: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853310.97622: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853310.97629: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11762 1726853310.97736: Set connection var ansible_timeout to 10 11762 1726853310.97744: Set connection var ansible_shell_type to sh 11762 1726853310.97754: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853310.97803: Set connection var ansible_shell_executable to /bin/sh 11762 1726853310.97806: Set connection var ansible_pipelining to False 11762 1726853310.97813: Set connection var ansible_connection to ssh 11762 1726853310.97820: variable 'ansible_shell_executable' from source: unknown 11762 1726853310.97829: variable 'ansible_connection' from source: unknown 11762 1726853310.97837: variable 'ansible_module_compression' from source: unknown 11762 1726853310.97843: variable 'ansible_shell_type' from source: unknown 11762 1726853310.97851: variable 'ansible_shell_executable' from source: unknown 11762 1726853310.97857: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853310.97864: variable 'ansible_pipelining' from source: unknown 11762 1726853310.97912: variable 'ansible_timeout' from source: unknown 11762 1726853310.97919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853310.98097: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853310.98115: variable 'omit' from source: magic vars 11762 1726853310.98130: starting attempt loop 11762 1726853310.98141: running the handler 11762 1726853310.98158: _low_level_execute_command(): starting 11762 1726853310.98170: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853310.98895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853310.98910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11762 1726853310.98930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853310.98947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853310.98964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853310.98990: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853310.99042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853310.99098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853310.99117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853310.99146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853310.99365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853311.01025: stdout chunk (state=3): >>>/root <<< 11762 1726853311.01126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853311.01189: stderr chunk (state=3): >>><<< 11762 1726853311.01417: stdout chunk (state=3): >>><<< 11762 1726853311.01424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853311.01427: _low_level_execute_command(): starting 11762 1726853311.01430: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533 `" && echo ansible-tmp-1726853311.013391-14635-174949522911533="` echo /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533 `" ) && sleep 0' 11762 1726853311.02539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853311.02600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853311.02777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853311.02890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853311.02995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853311.05036: stdout chunk (state=3): >>>ansible-tmp-1726853311.013391-14635-174949522911533=/root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533 <<< 11762 1726853311.05249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853311.05252: stdout chunk (state=3): >>><<< 11762 1726853311.05255: stderr chunk (state=3): >>><<< 11762 1726853311.05323: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853311.013391-14635-174949522911533=/root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853311.05678: variable 'ansible_module_compression' from source: unknown 11762 1726853311.05681: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11762 1726853311.05683: variable 'ansible_facts' from source: unknown 11762 1726853311.06076: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/AnsiballZ_package_facts.py 11762 1726853311.06395: Sending initial data 11762 1726853311.06405: Sent initial data (161 bytes) 11762 1726853311.07686: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853311.07881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853311.07993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853311.09706: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11762 1726853311.09722: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853311.09776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853311.09846: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpx5rw_l0p /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/AnsiballZ_package_facts.py <<< 11762 1726853311.09860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/AnsiballZ_package_facts.py" <<< 11762 1726853311.09919: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpx5rw_l0p" to remote "/root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/AnsiballZ_package_facts.py" <<< 11762 1726853311.09933: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/AnsiballZ_package_facts.py" <<< 11762 1726853311.13516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853311.13531: stderr chunk (state=3): >>><<< 11762 1726853311.13539: stdout chunk (state=3): >>><<< 11762 1726853311.13574: done transferring module to remote 11762 1726853311.13769: _low_level_execute_command(): starting 11762 1726853311.13775: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/ /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/AnsiballZ_package_facts.py && sleep 0' 11762 1726853311.14993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853311.15287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853311.17390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853311.17418: stderr chunk (state=3): >>><<< 11762 1726853311.17427: stdout chunk (state=3): >>><<< 11762 1726853311.17445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853311.17460: _low_level_execute_command(): starting 11762 1726853311.17657: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/AnsiballZ_package_facts.py && sleep 0' 11762 1726853311.18954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853311.18957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853311.18960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853311.18979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853311.19008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853311.19088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853311.19209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853311.19221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853311.19232: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11762 1726853311.19333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853311.65291: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11762 1726853311.65345: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11762 1726853311.65395: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version":
"2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": 
"1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": 
"rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": 
[{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", 
"version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "ep<<< 11762 1726853311.65511: stdout chunk (state=3): >>>och": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11762 1726853311.67705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853311.67732: stdout chunk (state=3): >>><<< 11762 1726853311.67735: stderr chunk (state=3): >>><<< 11762 1726853311.67787: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853311.70456: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853311.70461: _low_level_execute_command(): starting 11762 1726853311.70463: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853311.013391-14635-174949522911533/ > /dev/null 2>&1 && sleep 0' 11762 1726853311.71325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853311.71330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853311.71489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853311.71492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853311.71512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853311.71792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853311.73877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853311.73881: stdout chunk (state=3): >>><<< 11762 1726853311.73883: stderr chunk (state=3): >>><<< 11762 1726853311.73886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853311.73888: handler run complete 11762 
1726853311.74805: variable 'ansible_facts' from source: unknown 11762 1726853311.75398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853311.77398: variable 'ansible_facts' from source: unknown 11762 1726853311.77886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853311.78618: attempt loop complete, returning result 11762 1726853311.78636: _execute() done 11762 1726853311.78726: dumping result to json 11762 1726853311.78878: done dumping result, returning 11762 1726853311.78892: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-d845-03d0-000000000fe4] 11762 1726853311.78903: sending task result for task 02083763-bbaf-d845-03d0-000000000fe4 11762 1726853311.81426: done sending task result for task 02083763-bbaf-d845-03d0-000000000fe4 11762 1726853311.81430: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853311.81597: no more pending results, returning what we have 11762 1726853311.81600: results queue empty 11762 1726853311.81601: checking for any_errors_fatal 11762 1726853311.81606: done checking for any_errors_fatal 11762 1726853311.81606: checking for max_fail_percentage 11762 1726853311.81608: done checking for max_fail_percentage 11762 1726853311.81609: checking to see if all hosts have failed and the running result is not ok 11762 1726853311.81610: done checking to see if all hosts have failed 11762 1726853311.81611: getting the remaining hosts for this loop 11762 1726853311.81612: done getting the remaining hosts for this loop 11762 1726853311.81615: getting the next task for host managed_node2 11762 1726853311.81622: done getting next task for host managed_node2 11762 1726853311.81632: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853311.81637: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853311.81653: getting variables 11762 1726853311.81654: in VariableManager get_vars() 11762 1726853311.81693: Calling all_inventory to load vars for managed_node2 11762 1726853311.81696: Calling groups_inventory to load vars for managed_node2 11762 1726853311.81698: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853311.81707: Calling all_plugins_play to load vars for managed_node2 11762 1726853311.81709: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853311.81713: Calling groups_plugins_play to load vars for managed_node2 11762 1726853311.82990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853311.84807: done with get_vars() 11762 1726853311.84830: done getting variables 11762 1726853311.84894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:31 -0400 (0:00:00.895) 0:01:02.279 ****** 11762 1726853311.84936: entering _queue_task() for managed_node2/debug 11762 1726853311.85294: worker is 1 (out of 1 available) 11762 1726853311.85307: exiting _queue_task() for managed_node2/debug 11762 1726853311.85353: done queuing things up, now waiting for results queue to drain 11762 1726853311.85355: waiting for pending results... 
11762 1726853311.85663: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11762 1726853311.85759: in run() - task 02083763-bbaf-d845-03d0-000000000e0b 11762 1726853311.85782: variable 'ansible_search_path' from source: unknown 11762 1726853311.85786: variable 'ansible_search_path' from source: unknown 11762 1726853311.85812: calling self._execute() 11762 1726853311.85906: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853311.85910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853311.85947: variable 'omit' from source: magic vars 11762 1726853311.86331: variable 'ansible_distribution_major_version' from source: facts 11762 1726853311.86334: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853311.86337: variable 'omit' from source: magic vars 11762 1726853311.86396: variable 'omit' from source: magic vars 11762 1726853311.86487: variable 'network_provider' from source: set_fact 11762 1726853311.86503: variable 'omit' from source: magic vars 11762 1726853311.86551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853311.86582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853311.86646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853311.86650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853311.86656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853311.86660: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853311.86663: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 
1726853311.86665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853311.86754: Set connection var ansible_timeout to 10 11762 1726853311.86757: Set connection var ansible_shell_type to sh 11762 1726853311.86766: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853311.86769: Set connection var ansible_shell_executable to /bin/sh 11762 1726853311.86778: Set connection var ansible_pipelining to False 11762 1726853311.86785: Set connection var ansible_connection to ssh 11762 1726853311.86807: variable 'ansible_shell_executable' from source: unknown 11762 1726853311.86811: variable 'ansible_connection' from source: unknown 11762 1726853311.86814: variable 'ansible_module_compression' from source: unknown 11762 1726853311.86816: variable 'ansible_shell_type' from source: unknown 11762 1726853311.86818: variable 'ansible_shell_executable' from source: unknown 11762 1726853311.86820: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853311.86823: variable 'ansible_pipelining' from source: unknown 11762 1726853311.86863: variable 'ansible_timeout' from source: unknown 11762 1726853311.86866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853311.86969: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853311.86977: variable 'omit' from source: magic vars 11762 1726853311.86984: starting attempt loop 11762 1726853311.86987: running the handler 11762 1726853311.87099: handler run complete 11762 1726853311.87102: attempt loop complete, returning result 11762 1726853311.87104: _execute() done 11762 1726853311.87106: dumping result to json 11762 1726853311.87108: done dumping result, returning 
11762 1726853311.87110: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-d845-03d0-000000000e0b] 11762 1726853311.87112: sending task result for task 02083763-bbaf-d845-03d0-000000000e0b 11762 1726853311.87212: done sending task result for task 02083763-bbaf-d845-03d0-000000000e0b ok: [managed_node2] => {} MSG: Using network provider: nm 11762 1726853311.87297: no more pending results, returning what we have 11762 1726853311.87300: results queue empty 11762 1726853311.87301: checking for any_errors_fatal 11762 1726853311.87351: done checking for any_errors_fatal 11762 1726853311.87353: checking for max_fail_percentage 11762 1726853311.87355: done checking for max_fail_percentage 11762 1726853311.87356: checking to see if all hosts have failed and the running result is not ok 11762 1726853311.87356: done checking to see if all hosts have failed 11762 1726853311.87357: getting the remaining hosts for this loop 11762 1726853311.87358: done getting the remaining hosts for this loop 11762 1726853311.87415: getting the next task for host managed_node2 11762 1726853311.87424: done getting next task for host managed_node2 11762 1726853311.87427: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853311.87432: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853311.87442: WORKER PROCESS EXITING 11762 1726853311.87451: getting variables 11762 1726853311.87452: in VariableManager get_vars() 11762 1726853311.87493: Calling all_inventory to load vars for managed_node2 11762 1726853311.87496: Calling groups_inventory to load vars for managed_node2 11762 1726853311.87498: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853311.87506: Calling all_plugins_play to load vars for managed_node2 11762 1726853311.87509: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853311.87512: Calling groups_plugins_play to load vars for managed_node2 11762 1726853311.88793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853311.89924: done with get_vars() 11762 1726853311.89946: done getting variables 11762 1726853311.90000: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using 
the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:31 -0400 (0:00:00.051) 0:01:02.330 ****** 11762 1726853311.90038: entering _queue_task() for managed_node2/fail 11762 1726853311.90336: worker is 1 (out of 1 available) 11762 1726853311.90349: exiting _queue_task() for managed_node2/fail 11762 1726853311.90361: done queuing things up, now waiting for results queue to drain 11762 1726853311.90363: waiting for pending results... 11762 1726853311.90655: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11762 1726853311.90976: in run() - task 02083763-bbaf-d845-03d0-000000000e0c 11762 1726853311.90980: variable 'ansible_search_path' from source: unknown 11762 1726853311.90982: variable 'ansible_search_path' from source: unknown 11762 1726853311.90984: calling self._execute() 11762 1726853311.90986: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853311.90988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853311.90991: variable 'omit' from source: magic vars 11762 1726853311.91359: variable 'ansible_distribution_major_version' from source: facts 11762 1726853311.91376: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853311.91495: variable 'network_state' from source: role '' defaults 11762 1726853311.91512: Evaluated conditional (network_state != {}): False 11762 1726853311.91520: when evaluation is False, skipping this task 11762 1726853311.91527: _execute() done 11762 1726853311.91533: dumping result to json 11762 1726853311.91546: done dumping result, returning 11762 1726853311.91556: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network 
: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-d845-03d0-000000000e0c] 11762 1726853311.91565: sending task result for task 02083763-bbaf-d845-03d0-000000000e0c 11762 1726853311.91795: done sending task result for task 02083763-bbaf-d845-03d0-000000000e0c 11762 1726853311.91799: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853311.91848: no more pending results, returning what we have 11762 1726853311.91853: results queue empty 11762 1726853311.91854: checking for any_errors_fatal 11762 1726853311.91861: done checking for any_errors_fatal 11762 1726853311.91861: checking for max_fail_percentage 11762 1726853311.91864: done checking for max_fail_percentage 11762 1726853311.91865: checking to see if all hosts have failed and the running result is not ok 11762 1726853311.91866: done checking to see if all hosts have failed 11762 1726853311.91866: getting the remaining hosts for this loop 11762 1726853311.91869: done getting the remaining hosts for this loop 11762 1726853311.91874: getting the next task for host managed_node2 11762 1726853311.91883: done getting next task for host managed_node2 11762 1726853311.91887: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853311.91893: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853311.91918: getting variables 11762 1726853311.91919: in VariableManager get_vars() 11762 1726853311.91966: Calling all_inventory to load vars for managed_node2 11762 1726853311.91969: Calling groups_inventory to load vars for managed_node2 11762 1726853311.92132: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853311.92141: Calling all_plugins_play to load vars for managed_node2 11762 1726853311.92144: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853311.92147: Calling groups_plugins_play to load vars for managed_node2 11762 1726853311.94005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853311.95745: done with get_vars() 11762 1726853311.95769: done getting variables 11762 1726853311.96033: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** 
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:31 -0400 (0:00:00.060) 0:01:02.391 ****** 11762 1726853311.96068: entering _queue_task() for managed_node2/fail 11762 1726853311.96799: worker is 1 (out of 1 available) 11762 1726853311.96812: exiting _queue_task() for managed_node2/fail 11762 1726853311.96824: done queuing things up, now waiting for results queue to drain 11762 1726853311.96826: waiting for pending results... 11762 1726853311.96959: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11762 1726853311.97693: in run() - task 02083763-bbaf-d845-03d0-000000000e0d 11762 1726853311.97698: variable 'ansible_search_path' from source: unknown 11762 1726853311.97701: variable 'ansible_search_path' from source: unknown 11762 1726853311.97704: calling self._execute() 11762 1726853311.97707: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853311.97710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853311.97796: variable 'omit' from source: magic vars 11762 1726853311.98438: variable 'ansible_distribution_major_version' from source: facts 11762 1726853311.98580: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853311.98788: variable 'network_state' from source: role '' defaults 11762 1726853311.98824: Evaluated conditional (network_state != {}): False 11762 1726853311.98832: when evaluation is False, skipping this task 11762 1726853311.98839: _execute() done 11762 1726853311.98846: dumping result to json 11762 1726853311.98853: done dumping result, returning 11762 1726853311.98863: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the 
managed host is below 8 [02083763-bbaf-d845-03d0-000000000e0d] 11762 1726853311.98879: sending task result for task 02083763-bbaf-d845-03d0-000000000e0d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853311.99032: no more pending results, returning what we have 11762 1726853311.99036: results queue empty 11762 1726853311.99037: checking for any_errors_fatal 11762 1726853311.99044: done checking for any_errors_fatal 11762 1726853311.99045: checking for max_fail_percentage 11762 1726853311.99047: done checking for max_fail_percentage 11762 1726853311.99048: checking to see if all hosts have failed and the running result is not ok 11762 1726853311.99048: done checking to see if all hosts have failed 11762 1726853311.99049: getting the remaining hosts for this loop 11762 1726853311.99051: done getting the remaining hosts for this loop 11762 1726853311.99055: getting the next task for host managed_node2 11762 1726853311.99063: done getting next task for host managed_node2 11762 1726853311.99067: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853311.99074: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853311.99101: getting variables 11762 1726853311.99103: in VariableManager get_vars() 11762 1726853311.99149: Calling all_inventory to load vars for managed_node2 11762 1726853311.99152: Calling groups_inventory to load vars for managed_node2 11762 1726853311.99154: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853311.99165: Calling all_plugins_play to load vars for managed_node2 11762 1726853311.99168: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853311.99182: done sending task result for task 02083763-bbaf-d845-03d0-000000000e0d 11762 1726853311.99188: WORKER PROCESS EXITING 11762 1726853311.99376: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.01044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.02890: done with get_vars() 11762 1726853312.02933: done getting variables 11762 1726853312.02999: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:32 -0400 (0:00:00.069) 0:01:02.461 ****** 11762 1726853312.03067: entering _queue_task() for managed_node2/fail 11762 1726853312.03513: worker is 1 (out of 1 available) 11762 1726853312.03527: exiting _queue_task() for managed_node2/fail 11762 1726853312.03540: done queuing things up, now waiting for results queue to drain 11762 1726853312.03542: waiting for pending results... 11762 1726853312.04154: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11762 1726853312.04304: in run() - task 02083763-bbaf-d845-03d0-000000000e0e 11762 1726853312.04321: variable 'ansible_search_path' from source: unknown 11762 1726853312.04325: variable 'ansible_search_path' from source: unknown 11762 1726853312.04379: calling self._execute() 11762 1726853312.04461: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.04486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.04490: variable 'omit' from source: magic vars 11762 1726853312.04856: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.04975: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.05048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853312.07446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853312.07519: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853312.07558: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853312.07599: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853312.07625: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853312.07706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.07733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.07759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.07976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.07980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.07982: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.07984: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11762 1726853312.08031: variable 'ansible_distribution' from source: facts 11762 1726853312.08040: variable '__network_rh_distros' from source: role '' defaults 11762 1726853312.08056: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11762 1726853312.08292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853312.08319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.08356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.08399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.08419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.08480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.08508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.08537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.08590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.08611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.08665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.08700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.08726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.08767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.08790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.09093: variable 'network_connections' from source: task vars 11762 1726853312.09115: variable 'port2_profile' from source: play vars 11762 1726853312.09211: variable 'port2_profile' from source: play vars 11762 1726853312.09214: variable 'port1_profile' from source: play vars 11762 1726853312.09258: variable 'port1_profile' from source: play vars 11762 1726853312.09270: variable 'controller_profile' from source: play vars 11762 1726853312.09332: variable 'controller_profile' from source: play vars 11762 1726853312.09376: variable 'network_state' from source: role '' defaults 11762 1726853312.09418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853312.09590: Loading 
TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853312.09633: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853312.09675: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853312.09752: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853312.09763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853312.09793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853312.09823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.09855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853312.09893: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11762 1726853312.09976: when evaluation is False, skipping this task 11762 1726853312.09980: _execute() done 11762 1726853312.09983: dumping result to json 11762 1726853312.09985: done dumping result, returning 11762 1726853312.09987: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if 
the system version of the managed host is EL10 or later [02083763-bbaf-d845-03d0-000000000e0e] 11762 1726853312.09989: sending task result for task 02083763-bbaf-d845-03d0-000000000e0e 11762 1726853312.10066: done sending task result for task 02083763-bbaf-d845-03d0-000000000e0e 11762 1726853312.10072: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11762 1726853312.10120: no more pending results, returning what we have 11762 1726853312.10124: results queue empty 11762 1726853312.10125: checking for any_errors_fatal 11762 1726853312.10130: done checking for any_errors_fatal 11762 1726853312.10131: checking for max_fail_percentage 11762 1726853312.10133: done checking for max_fail_percentage 11762 1726853312.10134: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.10135: done checking to see if all hosts have failed 11762 1726853312.10135: getting the remaining hosts for this loop 11762 1726853312.10137: done getting the remaining hosts for this loop 11762 1726853312.10140: getting the next task for host managed_node2 11762 1726853312.10149: done getting next task for host managed_node2 11762 1726853312.10152: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853312.10157: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853312.10181: getting variables 11762 1726853312.10182: in VariableManager get_vars() 11762 1726853312.10228: Calling all_inventory to load vars for managed_node2 11762 1726853312.10231: Calling groups_inventory to load vars for managed_node2 11762 1726853312.10233: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.10243: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.10246: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.10248: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.11719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.13392: done with get_vars() 11762 1726853312.13416: done getting variables 11762 1726853312.13481: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:32 -0400 (0:00:00.104) 0:01:02.565 ****** 11762 1726853312.13518: entering _queue_task() for managed_node2/dnf 11762 1726853312.13867: worker is 1 (out of 1 available) 11762 1726853312.13885: exiting _queue_task() for managed_node2/dnf 11762 1726853312.13898: done queuing things up, now waiting for results queue to drain 11762 1726853312.13900: waiting for pending results... 11762 1726853312.14291: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11762 1726853312.14361: in run() - task 02083763-bbaf-d845-03d0-000000000e0f 11762 1726853312.14385: variable 'ansible_search_path' from source: unknown 11762 1726853312.14396: variable 'ansible_search_path' from source: unknown 11762 1726853312.14435: calling self._execute() 11762 1726853312.14540: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.14553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.14570: variable 'omit' from source: magic vars 11762 1726853312.14968: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.14987: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.15190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853312.17537: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853312.17541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853312.17560: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853312.17601: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853312.17630: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853312.17716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.17752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.17784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.17827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.17845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.17965: variable 'ansible_distribution' from source: facts 11762 1726853312.17980: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.17999: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 11762 1726853312.18115: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853312.18249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.18278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.18402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.18405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.18408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.18410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.18435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.18464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.18510: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.18528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.18570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.18598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.18631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.18674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.18695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.18854: variable 'network_connections' from source: task vars 11762 1726853312.18875: variable 'port2_profile' from source: play vars 11762 1726853312.18947: variable 'port2_profile' from source: play vars 11762 1726853312.18961: variable 'port1_profile' from source: play vars 11762 1726853312.19016: variable 'port1_profile' from source: play vars 11762 1726853312.19029: variable 'controller_profile' from source: play vars 11762 
1726853312.19091: variable 'controller_profile' from source: play vars 11762 1726853312.19162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853312.19379: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853312.19410: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853312.19488: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853312.19491: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853312.19530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853312.19566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853312.19604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.19635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853312.19692: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853312.20031: variable 'network_connections' from source: task vars 11762 1726853312.20035: variable 'port2_profile' from source: play vars 11762 1726853312.20037: variable 'port2_profile' from source: play vars 11762 1726853312.20040: variable 'port1_profile' from source: play vars 11762 1726853312.20085: variable 
'port1_profile' from source: play vars 11762 1726853312.20099: variable 'controller_profile' from source: play vars 11762 1726853312.20164: variable 'controller_profile' from source: play vars 11762 1726853312.20196: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853312.20204: when evaluation is False, skipping this task 11762 1726853312.20212: _execute() done 11762 1726853312.20221: dumping result to json 11762 1726853312.20228: done dumping result, returning 11762 1726853312.20239: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000e0f] 11762 1726853312.20253: sending task result for task 02083763-bbaf-d845-03d0-000000000e0f 11762 1726853312.20593: done sending task result for task 02083763-bbaf-d845-03d0-000000000e0f 11762 1726853312.20596: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853312.20645: no more pending results, returning what we have 11762 1726853312.20650: results queue empty 11762 1726853312.20650: checking for any_errors_fatal 11762 1726853312.20656: done checking for any_errors_fatal 11762 1726853312.20656: checking for max_fail_percentage 11762 1726853312.20659: done checking for max_fail_percentage 11762 1726853312.20659: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.20660: done checking to see if all hosts have failed 11762 1726853312.20661: getting the remaining hosts for this loop 11762 1726853312.20663: done getting the remaining hosts for this loop 11762 1726853312.20666: getting the next task for host managed_node2 11762 1726853312.20676: done getting next 
task for host managed_node2 11762 1726853312.20680: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853312.20685: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853312.20706: getting variables 11762 1726853312.20708: in VariableManager get_vars() 11762 1726853312.20752: Calling all_inventory to load vars for managed_node2 11762 1726853312.20755: Calling groups_inventory to load vars for managed_node2 11762 1726853312.20757: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.20765: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.20768: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.20770: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.22167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.23741: done with get_vars() 11762 1726853312.23766: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11762 1726853312.23840: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:28:32 -0400 (0:00:00.103) 0:01:02.669 ****** 11762 1726853312.23874: entering _queue_task() for managed_node2/yum 11762 1726853312.24188: worker is 1 (out of 1 available) 11762 1726853312.24201: exiting _queue_task() for managed_node2/yum 11762 1726853312.24213: done queuing things up, now waiting for results queue to drain 11762 1726853312.24214: waiting for pending results... 
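The record above shows the `ansible.builtin.yum` action being redirected to `ansible.builtin.dnf` before the YUM check task is queued. Piecing together the task name, the logged path (roles/network/tasks/main.yml:48), and the conditionals the executor evaluates (`ansible_distribution_major_version != '6'`: True, then `ansible_distribution_major_version | int < 8`: False), the task is plausibly of roughly this shape. This is a hypothetical reconstruction from the log, not the role's verbatim source; the module arguments are assumptions:

```yaml
# Hypothetical sketch reconstructed from the log; module arguments are assumptions
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:              # redirected to ansible.builtin.dnf on this host
    list: "{{ network_packages }}"  # assumed argument
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version | int < 8
```

On this EL9-era host the second condition is False, which matches the `skipping: [managed_node2]` result with `"false_condition": "ansible_distribution_major_version | int < 8"` reported below.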
11762 1726853312.24514: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11762 1726853312.24663: in run() - task 02083763-bbaf-d845-03d0-000000000e10 11762 1726853312.24685: variable 'ansible_search_path' from source: unknown 11762 1726853312.24696: variable 'ansible_search_path' from source: unknown 11762 1726853312.24734: calling self._execute() 11762 1726853312.24848: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.24862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.24880: variable 'omit' from source: magic vars 11762 1726853312.25285: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.25304: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.25494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853312.27919: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853312.27992: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853312.28036: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853312.28079: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853312.28116: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853312.28216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.28252: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.28288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.28339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.28421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.28468: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.28495: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11762 1726853312.28505: when evaluation is False, skipping this task 11762 1726853312.28514: _execute() done 11762 1726853312.28522: dumping result to json 11762 1726853312.28538: done dumping result, returning 11762 1726853312.28552: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000e10] 11762 1726853312.28564: sending task result for task 02083763-bbaf-d845-03d0-000000000e10 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11762 1726853312.28826: no more pending results, returning what we have 11762 1726853312.28831: results queue empty 11762 1726853312.28832: checking for any_errors_fatal 11762 1726853312.28840: done 
checking for any_errors_fatal 11762 1726853312.28840: checking for max_fail_percentage 11762 1726853312.28843: done checking for max_fail_percentage 11762 1726853312.28844: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.28844: done checking to see if all hosts have failed 11762 1726853312.28845: getting the remaining hosts for this loop 11762 1726853312.28847: done getting the remaining hosts for this loop 11762 1726853312.28851: getting the next task for host managed_node2 11762 1726853312.28859: done getting next task for host managed_node2 11762 1726853312.28864: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853312.28869: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853312.29093: getting variables 11762 1726853312.29095: in VariableManager get_vars() 11762 1726853312.29139: Calling all_inventory to load vars for managed_node2 11762 1726853312.29142: Calling groups_inventory to load vars for managed_node2 11762 1726853312.29145: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.29153: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.29156: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.29159: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.29744: done sending task result for task 02083763-bbaf-d845-03d0-000000000e10 11762 1726853312.29748: WORKER PROCESS EXITING 11762 1726853312.31334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.33092: done with get_vars() 11762 1726853312.33125: done getting variables 11762 1726853312.33187: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:28:32 -0400 (0:00:00.093) 0:01:02.762 ****** 11762 1726853312.33224: entering _queue_task() for managed_node2/fail 11762 1726853312.33577: worker is 1 (out of 1 available) 11762 1726853312.33593: exiting _queue_task() for managed_node2/fail 11762 1726853312.33606: done queuing things up, now waiting for results queue to drain 11762 1726853312.33608: waiting for pending results... 
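The `fail` action module loaded above backs the consent prompt. From the task name, the logged path (roles/network/tasks/main.yml:60), and the false_condition reported when it skips, the task plausibly looks like the sketch below; the `msg` wording is invented for illustration and the real text will differ:

```yaml
# Hypothetical sketch; the msg text is invented for illustration
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: NetworkManager must be restarted to activate wireless or team interfaces  # assumed
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since the play's `network_connections` define only bond controller and port profiles, neither flag is set, and the task skips as shown in the result that follows.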
11762 1726853312.33874: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11762 1726853312.34147: in run() - task 02083763-bbaf-d845-03d0-000000000e11 11762 1726853312.34168: variable 'ansible_search_path' from source: unknown 11762 1726853312.34179: variable 'ansible_search_path' from source: unknown 11762 1726853312.34231: calling self._execute() 11762 1726853312.34342: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.34358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.34377: variable 'omit' from source: magic vars 11762 1726853312.34851: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.35077: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.35080: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853312.35183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853312.37243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853312.37331: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853312.37415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853312.37634: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853312.37663: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853312.37858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853312.37893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.37915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.38076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.38091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.38138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.38166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.38304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.38341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.38357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.38500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.38528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.38558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.38622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.38659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.39279: variable 'network_connections' from source: task vars 11762 1726853312.39283: variable 'port2_profile' from source: play vars 11762 1726853312.39285: variable 'port2_profile' from source: play vars 11762 1726853312.39287: variable 'port1_profile' from source: play vars 11762 1726853312.39321: variable 'port1_profile' from source: play vars 11762 1726853312.39335: variable 'controller_profile' from source: play vars 11762 1726853312.39405: variable 'controller_profile' from source: play vars 11762 1726853312.39578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853312.39676: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853312.39721: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853312.39756: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853312.39791: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853312.39876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853312.39879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853312.39894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.39932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853312.39991: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853312.40253: variable 'network_connections' from source: task vars 11762 1726853312.40263: variable 'port2_profile' from source: play vars 11762 1726853312.40342: variable 'port2_profile' from source: play vars 11762 1726853312.40347: variable 'port1_profile' from source: play vars 11762 1726853312.40408: variable 'port1_profile' from source: play vars 11762 1726853312.40454: variable 'controller_profile' from source: play vars 11762 1726853312.40506: variable 'controller_profile' from source: play vars 11762 1726853312.40533: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853312.40561: when evaluation is False, skipping this task 11762 1726853312.40669: _execute() done 11762 1726853312.40677: dumping result to json 11762 1726853312.40680: done dumping result, returning 11762 1726853312.40683: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000e11] 11762 1726853312.40685: sending task result for task 02083763-bbaf-d845-03d0-000000000e11 11762 1726853312.40761: done sending task result for task 02083763-bbaf-d845-03d0-000000000e11 11762 1726853312.40764: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853312.40829: no more pending results, returning what we have 11762 1726853312.40834: results queue empty 11762 1726853312.40834: checking for any_errors_fatal 11762 1726853312.40840: done checking for any_errors_fatal 11762 1726853312.40841: checking for max_fail_percentage 11762 1726853312.40846: done checking for max_fail_percentage 11762 1726853312.40847: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.40848: done checking to see if all hosts have failed 11762 1726853312.40848: getting the remaining hosts for this loop 11762 1726853312.40851: done getting the remaining hosts for this loop 11762 1726853312.40854: getting the next task for host managed_node2 11762 1726853312.40863: done getting next task for host managed_node2 11762 1726853312.40867: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11762 1726853312.40875: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853312.40899: getting variables 11762 1726853312.40900: in VariableManager get_vars() 11762 1726853312.40950: Calling all_inventory to load vars for managed_node2 11762 1726853312.40953: Calling groups_inventory to load vars for managed_node2 11762 1726853312.40956: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.40965: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.40968: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.40976: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.42758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.44892: done with get_vars() 11762 1726853312.44921: done getting variables 11762 1726853312.44984: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:28:32 -0400 (0:00:00.117) 0:01:02.880 ****** 11762 1726853312.45020: entering _queue_task() for managed_node2/package 11762 1726853312.45357: worker is 1 (out of 1 available) 11762 1726853312.45377: exiting _queue_task() for managed_node2/package 11762 1726853312.45392: done queuing things up, now waiting for results queue to drain 11762 1726853312.45394: waiting for pending results... 
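The `package` action module loaded above drives the Install packages task at roles/network/tasks/main.yml:73. Given the `network_packages` role default resolved in the records that follow, a minimal sketch of the task is shown below; the argument names and `state` value are assumptions, not the role's verbatim source:

```yaml
# Hypothetical sketch of the Install packages task; name/state arguments are assumptions
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"  # assembled from __network_packages_default_* defaults
    state: present                  # assumed
  when: ansible_distribution_major_version != '6'  # condition seen evaluated in the log
```

Unlike the two conditionally skipped tasks before it, this one passes its distribution check and proceeds to resolve the provider-specific package list (`__network_packages_default_nm`, wpa_supplicant, team, and initscripts variants) in the variable lookups that follow.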
11762 1726853312.45716: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11762 1726853312.45884: in run() - task 02083763-bbaf-d845-03d0-000000000e12 11762 1726853312.45934: variable 'ansible_search_path' from source: unknown 11762 1726853312.45938: variable 'ansible_search_path' from source: unknown 11762 1726853312.46027: calling self._execute() 11762 1726853312.46099: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.46117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.46148: variable 'omit' from source: magic vars 11762 1726853312.46681: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.46684: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.46868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853312.47230: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853312.47281: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853312.47318: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853312.47369: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853312.47491: variable 'network_packages' from source: role '' defaults 11762 1726853312.47669: variable '__network_provider_setup' from source: role '' defaults 11762 1726853312.47674: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853312.47738: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853312.47786: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853312.48020: variable 
'__network_packages_default_nm' from source: role '' defaults 11762 1726853312.48181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853312.50438: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853312.50497: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853312.50529: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853312.56239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853312.56263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853312.56336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.56363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.56395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.56433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.56448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 
1726853312.56489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.56514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.56537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.56575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.56588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.56817: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853312.56936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.57047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.57050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.57053: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.57055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.57117: variable 'ansible_python' from source: facts 11762 1726853312.57132: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853312.57214: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853312.57288: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853312.57390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.57410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.57431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.57463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.57477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.57522: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.57546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.57567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.57821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.57825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.57828: variable 'network_connections' from source: task vars 11762 1726853312.57830: variable 'port2_profile' from source: play vars 11762 1726853312.57879: variable 'port2_profile' from source: play vars 11762 1726853312.57890: variable 'port1_profile' from source: play vars 11762 1726853312.57992: variable 'port1_profile' from source: play vars 11762 1726853312.58003: variable 'controller_profile' from source: play vars 11762 1726853312.58105: variable 'controller_profile' from source: play vars 11762 1726853312.58175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853312.58204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 11762 1726853312.58234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.58269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853312.58311: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853312.58603: variable 'network_connections' from source: task vars 11762 1726853312.58606: variable 'port2_profile' from source: play vars 11762 1726853312.58708: variable 'port2_profile' from source: play vars 11762 1726853312.58718: variable 'port1_profile' from source: play vars 11762 1726853312.58818: variable 'port1_profile' from source: play vars 11762 1726853312.58828: variable 'controller_profile' from source: play vars 11762 1726853312.58925: variable 'controller_profile' from source: play vars 11762 1726853312.58954: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853312.59034: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853312.59391: variable 'network_connections' from source: task vars 11762 1726853312.59395: variable 'port2_profile' from source: play vars 11762 1726853312.59677: variable 'port2_profile' from source: play vars 11762 1726853312.59680: variable 'port1_profile' from source: play vars 11762 1726853312.59682: variable 'port1_profile' from source: play vars 11762 1726853312.59685: variable 'controller_profile' from source: play vars 11762 1726853312.59687: variable 'controller_profile' from source: play vars 11762 1726853312.59689: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853312.59691: variable '__network_team_connections_defined' from 
source: role '' defaults 11762 1726853312.59966: variable 'network_connections' from source: task vars 11762 1726853312.59970: variable 'port2_profile' from source: play vars 11762 1726853312.60039: variable 'port2_profile' from source: play vars 11762 1726853312.60047: variable 'port1_profile' from source: play vars 11762 1726853312.60112: variable 'port1_profile' from source: play vars 11762 1726853312.60120: variable 'controller_profile' from source: play vars 11762 1726853312.60180: variable 'controller_profile' from source: play vars 11762 1726853312.60234: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853312.60290: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853312.60296: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853312.60356: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853312.60978: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853312.61319: variable 'network_connections' from source: task vars 11762 1726853312.61322: variable 'port2_profile' from source: play vars 11762 1726853312.61385: variable 'port2_profile' from source: play vars 11762 1726853312.61394: variable 'port1_profile' from source: play vars 11762 1726853312.61459: variable 'port1_profile' from source: play vars 11762 1726853312.61466: variable 'controller_profile' from source: play vars 11762 1726853312.61523: variable 'controller_profile' from source: play vars 11762 1726853312.61536: variable 'ansible_distribution' from source: facts 11762 1726853312.61539: variable '__network_rh_distros' from source: role '' defaults 11762 1726853312.61547: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.61560: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853312.61721: 
variable 'ansible_distribution' from source: facts 11762 1726853312.61725: variable '__network_rh_distros' from source: role '' defaults 11762 1726853312.61737: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.61748: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853312.61910: variable 'ansible_distribution' from source: facts 11762 1726853312.61914: variable '__network_rh_distros' from source: role '' defaults 11762 1726853312.61967: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.61970: variable 'network_provider' from source: set_fact 11762 1726853312.61975: variable 'ansible_facts' from source: unknown 11762 1726853312.62646: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11762 1726853312.62649: when evaluation is False, skipping this task 11762 1726853312.62652: _execute() done 11762 1726853312.62654: dumping result to json 11762 1726853312.62656: done dumping result, returning 11762 1726853312.62661: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-d845-03d0-000000000e12] 11762 1726853312.62665: sending task result for task 02083763-bbaf-d845-03d0-000000000e12 11762 1726853312.62922: done sending task result for task 02083763-bbaf-d845-03d0-000000000e12 11762 1726853312.62924: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11762 1726853312.62969: no more pending results, returning what we have 11762 1726853312.62974: results queue empty 11762 1726853312.62975: checking for any_errors_fatal 11762 1726853312.62980: done checking for any_errors_fatal 11762 1726853312.62981: checking for max_fail_percentage 11762 1726853312.62983: done checking for max_fail_percentage 11762 
1726853312.62984: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.62985: done checking to see if all hosts have failed 11762 1726853312.62985: getting the remaining hosts for this loop 11762 1726853312.62987: done getting the remaining hosts for this loop 11762 1726853312.62995: getting the next task for host managed_node2 11762 1726853312.63002: done getting next task for host managed_node2 11762 1726853312.63006: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853312.63010: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853312.63029: getting variables 11762 1726853312.63030: in VariableManager get_vars() 11762 1726853312.63072: Calling all_inventory to load vars for managed_node2 11762 1726853312.63075: Calling groups_inventory to load vars for managed_node2 11762 1726853312.63077: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.63085: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.63088: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.63090: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.76548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.78809: done with get_vars() 11762 1726853312.78848: done getting variables 11762 1726853312.78916: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:32 -0400 (0:00:00.339) 0:01:03.219 ****** 11762 1726853312.78956: entering _queue_task() for managed_node2/package 11762 1726853312.79379: worker is 1 (out of 1 available) 11762 1726853312.79393: exiting _queue_task() for managed_node2/package 11762 1726853312.79407: done queuing things up, now waiting for results queue to drain 11762 1726853312.79409: waiting for pending results... 
11762 1726853312.79897: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11762 1726853312.80202: in run() - task 02083763-bbaf-d845-03d0-000000000e13 11762 1726853312.80483: variable 'ansible_search_path' from source: unknown 11762 1726853312.80487: variable 'ansible_search_path' from source: unknown 11762 1726853312.80490: calling self._execute() 11762 1726853312.80492: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.80497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.80593: variable 'omit' from source: magic vars 11762 1726853312.81278: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.81293: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.81429: variable 'network_state' from source: role '' defaults 11762 1726853312.81441: Evaluated conditional (network_state != {}): False 11762 1726853312.81444: when evaluation is False, skipping this task 11762 1726853312.81449: _execute() done 11762 1726853312.81454: dumping result to json 11762 1726853312.81458: done dumping result, returning 11762 1726853312.81466: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-d845-03d0-000000000e13] 11762 1726853312.81475: sending task result for task 02083763-bbaf-d845-03d0-000000000e13 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853312.81641: no more pending results, returning what we have 11762 1726853312.81646: results queue empty 11762 1726853312.81647: checking for any_errors_fatal 11762 1726853312.81655: done checking for any_errors_fatal 11762 1726853312.81656: checking for max_fail_percentage 11762 
1726853312.81657: done checking for max_fail_percentage 11762 1726853312.81658: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.81659: done checking to see if all hosts have failed 11762 1726853312.81659: getting the remaining hosts for this loop 11762 1726853312.81661: done getting the remaining hosts for this loop 11762 1726853312.81664: getting the next task for host managed_node2 11762 1726853312.81676: done getting next task for host managed_node2 11762 1726853312.81683: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853312.81689: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853312.81712: getting variables 11762 1726853312.81713: in VariableManager get_vars() 11762 1726853312.81757: Calling all_inventory to load vars for managed_node2 11762 1726853312.81760: Calling groups_inventory to load vars for managed_node2 11762 1726853312.81762: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.81870: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.81876: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.81881: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.82399: done sending task result for task 02083763-bbaf-d845-03d0-000000000e13 11762 1726853312.82403: WORKER PROCESS EXITING 11762 1726853312.82700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.84411: done with get_vars() 11762 1726853312.84441: done getting variables 11762 1726853312.84514: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:32 -0400 (0:00:00.056) 0:01:03.276 ****** 11762 1726853312.84587: entering _queue_task() for managed_node2/package 11762 1726853312.84994: worker is 1 (out of 1 available) 11762 1726853312.85010: exiting _queue_task() for managed_node2/package 11762 1726853312.85026: done queuing things up, now waiting for results queue to drain 11762 1726853312.85027: waiting for pending results... 
11762 1726853312.85295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11762 1726853312.85408: in run() - task 02083763-bbaf-d845-03d0-000000000e14 11762 1726853312.85422: variable 'ansible_search_path' from source: unknown 11762 1726853312.85427: variable 'ansible_search_path' from source: unknown 11762 1726853312.85457: calling self._execute() 11762 1726853312.85536: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.85541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.85553: variable 'omit' from source: magic vars 11762 1726853312.85852: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.85861: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.85948: variable 'network_state' from source: role '' defaults 11762 1726853312.85958: Evaluated conditional (network_state != {}): False 11762 1726853312.85962: when evaluation is False, skipping this task 11762 1726853312.85965: _execute() done 11762 1726853312.85968: dumping result to json 11762 1726853312.85972: done dumping result, returning 11762 1726853312.85979: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-d845-03d0-000000000e14] 11762 1726853312.85984: sending task result for task 02083763-bbaf-d845-03d0-000000000e14 11762 1726853312.86080: done sending task result for task 02083763-bbaf-d845-03d0-000000000e14 11762 1726853312.86083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 1726853312.86158: no more pending results, returning what we have 11762 1726853312.86163: results queue empty 11762 1726853312.86163: checking for 
any_errors_fatal 11762 1726853312.86168: done checking for any_errors_fatal 11762 1726853312.86169: checking for max_fail_percentage 11762 1726853312.86172: done checking for max_fail_percentage 11762 1726853312.86173: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.86174: done checking to see if all hosts have failed 11762 1726853312.86175: getting the remaining hosts for this loop 11762 1726853312.86176: done getting the remaining hosts for this loop 11762 1726853312.86180: getting the next task for host managed_node2 11762 1726853312.86188: done getting next task for host managed_node2 11762 1726853312.86192: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853312.86197: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853312.86216: getting variables 11762 1726853312.86217: in VariableManager get_vars() 11762 1726853312.86257: Calling all_inventory to load vars for managed_node2 11762 1726853312.86259: Calling groups_inventory to load vars for managed_node2 11762 1726853312.86261: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.86270: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.86280: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.86283: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.88207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.89727: done with get_vars() 11762 1726853312.89749: done getting variables 11762 1726853312.89797: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:32 -0400 (0:00:00.052) 0:01:03.328 ****** 11762 1726853312.89826: entering _queue_task() for managed_node2/service 11762 1726853312.90099: worker is 1 (out of 1 available) 11762 1726853312.90115: exiting _queue_task() for managed_node2/service 11762 1726853312.90127: done queuing things up, now waiting for results queue to drain 11762 1726853312.90129: waiting for pending results... 
11762 1726853312.90322: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11762 1726853312.90426: in run() - task 02083763-bbaf-d845-03d0-000000000e15 11762 1726853312.90439: variable 'ansible_search_path' from source: unknown 11762 1726853312.90445: variable 'ansible_search_path' from source: unknown 11762 1726853312.90476: calling self._execute() 11762 1726853312.90561: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.90567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.90581: variable 'omit' from source: magic vars 11762 1726853312.90859: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.90868: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.90957: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853312.91127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853312.93800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853312.93857: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853312.93885: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853312.93910: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853312.93931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853312.93993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11762 1726853312.94014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.94031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.94060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.94072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.94108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.94124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.94140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.94169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.94181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.94209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853312.94226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853312.94244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.94269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853312.94281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853312.94398: variable 'network_connections' from source: task vars 11762 1726853312.94411: variable 'port2_profile' from source: play vars 11762 1726853312.94461: variable 'port2_profile' from source: play vars 11762 1726853312.94472: variable 'port1_profile' from source: play vars 11762 1726853312.94517: variable 'port1_profile' from source: play vars 11762 1726853312.94526: variable 'controller_profile' from source: play vars 11762 1726853312.94568: variable 'controller_profile' from source: play vars 11762 1726853312.94623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853312.94742: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853312.94769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853312.94792: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853312.94815: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853312.94850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853312.94862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853312.94881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853312.94898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853312.94941: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853312.95096: variable 'network_connections' from source: task vars 11762 1726853312.95099: variable 'port2_profile' from source: play vars 11762 1726853312.95142: variable 'port2_profile' from source: play vars 11762 1726853312.95148: variable 'port1_profile' from source: play vars 11762 1726853312.95192: variable 'port1_profile' from source: play vars 11762 1726853312.95199: variable 'controller_profile' from source: play vars 11762 1726853312.95239: variable 'controller_profile' from source: play vars 11762 1726853312.95260: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11762 1726853312.95275: when evaluation is False, skipping this task 11762 1726853312.95279: _execute() done 11762 1726853312.95283: dumping result to json 11762 1726853312.95285: done dumping result, returning 11762 1726853312.95287: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-d845-03d0-000000000e15] 11762 1726853312.95289: sending task result for task 02083763-bbaf-d845-03d0-000000000e15 11762 1726853312.95386: done sending task result for task 02083763-bbaf-d845-03d0-000000000e15 11762 1726853312.95388: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11762 1726853312.95433: no more pending results, returning what we have 11762 1726853312.95437: results queue empty 11762 1726853312.95437: checking for any_errors_fatal 11762 1726853312.95451: done checking for any_errors_fatal 11762 1726853312.95452: checking for max_fail_percentage 11762 1726853312.95454: done checking for max_fail_percentage 11762 1726853312.95455: checking to see if all hosts have failed and the running result is not ok 11762 1726853312.95456: done checking to see if all hosts have failed 11762 1726853312.95456: getting the remaining hosts for this loop 11762 1726853312.95458: done getting the remaining hosts for this loop 11762 1726853312.95461: getting the next task for host managed_node2 11762 1726853312.95469: done getting next task for host managed_node2 11762 1726853312.95474: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853312.95479: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853312.95500: getting variables 11762 1726853312.95502: in VariableManager get_vars() 11762 1726853312.95548: Calling all_inventory to load vars for managed_node2 11762 1726853312.95551: Calling groups_inventory to load vars for managed_node2 11762 1726853312.95553: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853312.95562: Calling all_plugins_play to load vars for managed_node2 11762 1726853312.95564: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853312.95566: Calling groups_plugins_play to load vars for managed_node2 11762 1726853312.96849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853312.97848: done with get_vars() 11762 1726853312.97867: done getting variables 11762 1726853312.97912: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:28:32 -0400 (0:00:00.081) 0:01:03.409 ****** 11762 1726853312.97937: entering _queue_task() for managed_node2/service 11762 1726853312.98195: worker is 1 (out of 1 available) 11762 1726853312.98209: exiting _queue_task() for managed_node2/service 11762 1726853312.98224: done queuing things up, now waiting for results queue to drain 11762 1726853312.98226: waiting for pending results... 
11762 1726853312.98434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11762 1726853312.98540: in run() - task 02083763-bbaf-d845-03d0-000000000e16 11762 1726853312.98555: variable 'ansible_search_path' from source: unknown 11762 1726853312.98559: variable 'ansible_search_path' from source: unknown 11762 1726853312.98589: calling self._execute() 11762 1726853312.98663: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853312.98667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853312.98681: variable 'omit' from source: magic vars 11762 1726853312.98958: variable 'ansible_distribution_major_version' from source: facts 11762 1726853312.98968: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853312.99081: variable 'network_provider' from source: set_fact 11762 1726853312.99085: variable 'network_state' from source: role '' defaults 11762 1726853312.99094: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11762 1726853312.99099: variable 'omit' from source: magic vars 11762 1726853312.99157: variable 'omit' from source: magic vars 11762 1726853312.99178: variable 'network_service_name' from source: role '' defaults 11762 1726853312.99227: variable 'network_service_name' from source: role '' defaults 11762 1726853312.99302: variable '__network_provider_setup' from source: role '' defaults 11762 1726853312.99306: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853312.99355: variable '__network_service_name_default_nm' from source: role '' defaults 11762 1726853312.99362: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853312.99407: variable '__network_packages_default_nm' from source: role '' defaults 11762 1726853312.99559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 11762 1726853313.01011: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853313.01065: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853313.01095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853313.01119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853313.01138: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853313.01201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.01222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.01239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.01265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.01278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.01312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11762 1726853313.01328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.01347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.01370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.01382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.01529: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11762 1726853313.01600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.01619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.01636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.01661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.01673: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.01733: variable 'ansible_python' from source: facts 11762 1726853313.01748: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11762 1726853313.01801: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853313.01855: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853313.01936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.01955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.01973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.01997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.02007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.02038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.02063: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.02079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.02103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.02113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.02203: variable 'network_connections' from source: task vars 11762 1726853313.02209: variable 'port2_profile' from source: play vars 11762 1726853313.02261: variable 'port2_profile' from source: play vars 11762 1726853313.02273: variable 'port1_profile' from source: play vars 11762 1726853313.02324: variable 'port1_profile' from source: play vars 11762 1726853313.02333: variable 'controller_profile' from source: play vars 11762 1726853313.02387: variable 'controller_profile' from source: play vars 11762 1726853313.02456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853313.02583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853313.02618: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853313.02650: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853313.02681: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853313.02723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853313.02747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853313.02767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.02791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853313.02830: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853313.03004: variable 'network_connections' from source: task vars 11762 1726853313.03010: variable 'port2_profile' from source: play vars 11762 1726853313.03065: variable 'port2_profile' from source: play vars 11762 1726853313.03076: variable 'port1_profile' from source: play vars 11762 1726853313.03126: variable 'port1_profile' from source: play vars 11762 1726853313.03136: variable 'controller_profile' from source: play vars 11762 1726853313.03189: variable 'controller_profile' from source: play vars 11762 1726853313.03212: variable '__network_packages_default_wireless' from source: role '' defaults 11762 1726853313.03269: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853313.03449: variable 'network_connections' from source: task vars 11762 1726853313.03452: variable 'port2_profile' from source: play vars 11762 1726853313.03504: variable 'port2_profile' from source: play vars 11762 
1726853313.03510: variable 'port1_profile' from source: play vars 11762 1726853313.03558: variable 'port1_profile' from source: play vars 11762 1726853313.03564: variable 'controller_profile' from source: play vars 11762 1726853313.03615: variable 'controller_profile' from source: play vars 11762 1726853313.03632: variable '__network_packages_default_team' from source: role '' defaults 11762 1726853313.03687: variable '__network_team_connections_defined' from source: role '' defaults 11762 1726853313.03874: variable 'network_connections' from source: task vars 11762 1726853313.03878: variable 'port2_profile' from source: play vars 11762 1726853313.03928: variable 'port2_profile' from source: play vars 11762 1726853313.03932: variable 'port1_profile' from source: play vars 11762 1726853313.03981: variable 'port1_profile' from source: play vars 11762 1726853313.03987: variable 'controller_profile' from source: play vars 11762 1726853313.04036: variable 'controller_profile' from source: play vars 11762 1726853313.04073: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853313.04114: variable '__network_service_name_default_initscripts' from source: role '' defaults 11762 1726853313.04120: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853313.04165: variable '__network_packages_default_initscripts' from source: role '' defaults 11762 1726853313.04300: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11762 1726853313.04608: variable 'network_connections' from source: task vars 11762 1726853313.04611: variable 'port2_profile' from source: play vars 11762 1726853313.04654: variable 'port2_profile' from source: play vars 11762 1726853313.04661: variable 'port1_profile' from source: play vars 11762 1726853313.04705: variable 'port1_profile' from source: play vars 11762 1726853313.04711: variable 'controller_profile' from source: play vars 11762 
1726853313.04751: variable 'controller_profile' from source: play vars 11762 1726853313.04757: variable 'ansible_distribution' from source: facts 11762 1726853313.04760: variable '__network_rh_distros' from source: role '' defaults 11762 1726853313.04766: variable 'ansible_distribution_major_version' from source: facts 11762 1726853313.04779: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11762 1726853313.04889: variable 'ansible_distribution' from source: facts 11762 1726853313.04892: variable '__network_rh_distros' from source: role '' defaults 11762 1726853313.04899: variable 'ansible_distribution_major_version' from source: facts 11762 1726853313.04911: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11762 1726853313.05018: variable 'ansible_distribution' from source: facts 11762 1726853313.05023: variable '__network_rh_distros' from source: role '' defaults 11762 1726853313.05025: variable 'ansible_distribution_major_version' from source: facts 11762 1726853313.05050: variable 'network_provider' from source: set_fact 11762 1726853313.05067: variable 'omit' from source: magic vars 11762 1726853313.05089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853313.05109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853313.05126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853313.05138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853313.05147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853313.05169: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853313.05174: 
variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.05176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853313.05246: Set connection var ansible_timeout to 10 11762 1726853313.05249: Set connection var ansible_shell_type to sh 11762 1726853313.05251: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853313.05256: Set connection var ansible_shell_executable to /bin/sh 11762 1726853313.05262: Set connection var ansible_pipelining to False 11762 1726853313.05268: Set connection var ansible_connection to ssh 11762 1726853313.05288: variable 'ansible_shell_executable' from source: unknown 11762 1726853313.05291: variable 'ansible_connection' from source: unknown 11762 1726853313.05293: variable 'ansible_module_compression' from source: unknown 11762 1726853313.05295: variable 'ansible_shell_type' from source: unknown 11762 1726853313.05297: variable 'ansible_shell_executable' from source: unknown 11762 1726853313.05300: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.05304: variable 'ansible_pipelining' from source: unknown 11762 1726853313.05306: variable 'ansible_timeout' from source: unknown 11762 1726853313.05310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853313.05384: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853313.05393: variable 'omit' from source: magic vars 11762 1726853313.05398: starting attempt loop 11762 1726853313.05400: running the handler 11762 1726853313.05456: variable 'ansible_facts' from source: unknown 11762 1726853313.05920: _low_level_execute_command(): starting 11762 1726853313.05926: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853313.06435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853313.06439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.06441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853313.06443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853313.06446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.06492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853313.06495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853313.06497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853313.06582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853313.08451: stdout chunk (state=3): >>>/root <<< 11762 1726853313.08552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853313.08582: stderr chunk (state=3): >>><<< 11762 1726853313.08585: stdout chunk (state=3): >>><<< 11762 1726853313.08601: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853313.08611: _low_level_execute_command(): starting 11762 1726853313.08617: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665 `" && echo ansible-tmp-1726853313.0860083-14724-130458151730665="` echo /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665 `" ) && sleep 0' 11762 1726853313.09053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853313.09056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.09059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853313.09061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853313.09063: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.09109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853313.09112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853313.09118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853313.09193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853313.11218: stdout chunk (state=3): >>>ansible-tmp-1726853313.0860083-14724-130458151730665=/root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665 <<< 11762 1726853313.11323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853313.11351: stderr chunk (state=3): >>><<< 11762 1726853313.11354: stdout chunk (state=3): >>><<< 11762 1726853313.11368: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853313.0860083-14724-130458151730665=/root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853313.11399: variable 'ansible_module_compression' from source: unknown 11762 1726853313.11445: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11762 1726853313.11492: variable 'ansible_facts' from source: unknown 11762 1726853313.11627: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/AnsiballZ_systemd.py 11762 1726853313.11730: Sending initial data 11762 1726853313.11733: Sent initial data (156 bytes) 11762 1726853313.12167: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853313.12173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853313.12204: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.12208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853313.12211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.12262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853313.12265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853313.12268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853313.12345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853313.14299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11762 
1726853313.14304: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853313.14356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853313.14426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpaguawgb7 /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/AnsiballZ_systemd.py <<< 11762 1726853313.14437: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/AnsiballZ_systemd.py" <<< 11762 1726853313.14498: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpaguawgb7" to remote "/root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/AnsiballZ_systemd.py" <<< 11762 1726853313.14501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/AnsiballZ_systemd.py" <<< 11762 1726853313.15735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853313.15782: stderr chunk (state=3): >>><<< 11762 1726853313.15785: stdout chunk (state=3): >>><<< 11762 1726853313.15807: done transferring module to remote 11762 1726853313.15816: _low_level_execute_command(): starting 11762 1726853313.15821: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/ /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/AnsiballZ_systemd.py && sleep 0' 11762 1726853313.16434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.16458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853313.16477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853313.16497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853313.16596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853313.18477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853313.18501: stderr chunk (state=3): >>><<< 11762 1726853313.18504: stdout chunk (state=3): >>><<< 11762 1726853313.18515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853313.18518: _low_level_execute_command(): starting 11762 1726853313.18523: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/AnsiballZ_systemd.py && sleep 0' 11762 1726853313.19104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853313.19209: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11762 1726853313.49528: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl cal<<< 11762 1726853313.49547: stdout chunk (state=3): >>>l org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4608000", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301855232", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "886295000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": 
"yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": 
"no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "Priva<<< 11762 1726853313.49589: stdout chunk (state=3): >>>teIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": 
"NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 
13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11762 1726853313.51568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853313.51595: stderr chunk (state=3): >>><<< 11762 1726853313.51598: stdout chunk (state=3): >>><<< 11762 1726853313.51614: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": 
"1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4608000", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301855232", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "886295000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", 
"CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853313.51734: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853313.51751: _low_level_execute_command(): starting 11762 1726853313.51755: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853313.0860083-14724-130458151730665/ > /dev/null 2>&1 && sleep 0' 11762 1726853313.52248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853313.52252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853313.52255: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853313.52257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.52338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853313.52394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853313.54277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853313.54303: stderr chunk (state=3): >>><<< 11762 1726853313.54306: stdout chunk (state=3): >>><<< 11762 1726853313.54318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853313.54324: handler run complete 11762 1726853313.54363: attempt loop complete, returning result 11762 1726853313.54366: _execute() done 11762 1726853313.54369: dumping result to json 11762 1726853313.54383: done dumping result, returning 11762 1726853313.54392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-d845-03d0-000000000e16] 11762 1726853313.54396: sending task result for task 02083763-bbaf-d845-03d0-000000000e16 11762 1726853313.54630: done sending task result for task 02083763-bbaf-d845-03d0-000000000e16 11762 1726853313.54632: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853313.54693: no more pending results, returning what we have 11762 1726853313.54697: results queue empty 11762 1726853313.54698: checking for any_errors_fatal 11762 1726853313.54704: done checking for any_errors_fatal 11762 1726853313.54704: checking for max_fail_percentage 11762 1726853313.54706: done checking for max_fail_percentage 11762 1726853313.54707: 
checking to see if all hosts have failed and the running result is not ok 11762 1726853313.54711: done checking to see if all hosts have failed 11762 1726853313.54713: getting the remaining hosts for this loop 11762 1726853313.54716: done getting the remaining hosts for this loop 11762 1726853313.54719: getting the next task for host managed_node2 11762 1726853313.54726: done getting next task for host managed_node2 11762 1726853313.54729: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11762 1726853313.54734: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853313.54750: getting variables 11762 1726853313.54751: in VariableManager get_vars() 11762 1726853313.54790: Calling all_inventory to load vars for managed_node2 11762 1726853313.54793: Calling groups_inventory to load vars for managed_node2 11762 1726853313.54795: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853313.54803: Calling all_plugins_play to load vars for managed_node2 11762 1726853313.54806: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853313.54808: Calling groups_plugins_play to load vars for managed_node2 11762 1726853313.56092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853313.57783: done with get_vars() 11762 1726853313.57804: done getting variables 11762 1726853313.57851: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:28:33 -0400 (0:00:00.599) 0:01:04.009 ****** 11762 1726853313.57882: entering _queue_task() for managed_node2/service 11762 1726853313.58139: worker is 1 (out of 1 available) 11762 1726853313.58152: exiting _queue_task() for managed_node2/service 11762 1726853313.58167: done queuing things up, now waiting for results queue to drain 11762 1726853313.58169: waiting for pending results... 
11762 1726853313.58364: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11762 1726853313.58694: in run() - task 02083763-bbaf-d845-03d0-000000000e17 11762 1726853313.58698: variable 'ansible_search_path' from source: unknown 11762 1726853313.58701: variable 'ansible_search_path' from source: unknown 11762 1726853313.58705: calling self._execute() 11762 1726853313.58707: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.58709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853313.58713: variable 'omit' from source: magic vars 11762 1726853313.59035: variable 'ansible_distribution_major_version' from source: facts 11762 1726853313.59077: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853313.59175: variable 'network_provider' from source: set_fact 11762 1726853313.59187: Evaluated conditional (network_provider == "nm"): True 11762 1726853313.59476: variable '__network_wpa_supplicant_required' from source: role '' defaults 11762 1726853313.59479: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11762 1726853313.59688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853313.62105: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853313.62177: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853313.62216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853313.62256: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853313.62287: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853313.62368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.62404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.62431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.62479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.62497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.62547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.62577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.62605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.62648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.62666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.62710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.62736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.62766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.62807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.62824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.62970: variable 'network_connections' from source: task vars 11762 1726853313.62991: variable 'port2_profile' from source: play vars 11762 1726853313.63064: variable 'port2_profile' from source: play vars 11762 1726853313.63082: variable 'port1_profile' from source: play vars 11762 1726853313.63146: variable 'port1_profile' from source: play vars 11762 1726853313.63376: variable 'controller_profile' from source: play vars 11762 1726853313.63379: variable 'controller_profile' from source: play vars 11762 
1726853313.63381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11762 1726853313.63480: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11762 1726853313.63520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11762 1726853313.63558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11762 1726853313.63591: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11762 1726853313.63635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11762 1726853313.63664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11762 1726853313.63696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.63721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11762 1726853313.63774: variable '__network_wireless_connections_defined' from source: role '' defaults 11762 1726853313.64034: variable 'network_connections' from source: task vars 11762 1726853313.64044: variable 'port2_profile' from source: play vars 11762 1726853313.64122: variable 'port2_profile' from source: play vars 11762 1726853313.64139: variable 'port1_profile' from source: play vars 11762 1726853313.64206: variable 'port1_profile' from source: play vars 11762 1726853313.64334: variable 
'controller_profile' from source: play vars 11762 1726853313.64337: variable 'controller_profile' from source: play vars 11762 1726853313.64339: Evaluated conditional (__network_wpa_supplicant_required): False 11762 1726853313.64381: when evaluation is False, skipping this task 11762 1726853313.64443: _execute() done 11762 1726853313.64447: dumping result to json 11762 1726853313.64453: done dumping result, returning 11762 1726853313.64456: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-d845-03d0-000000000e17] 11762 1726853313.64459: sending task result for task 02083763-bbaf-d845-03d0-000000000e17 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11762 1726853313.64738: no more pending results, returning what we have 11762 1726853313.64742: results queue empty 11762 1726853313.64743: checking for any_errors_fatal 11762 1726853313.64769: done checking for any_errors_fatal 11762 1726853313.64775: checking for max_fail_percentage 11762 1726853313.64778: done checking for max_fail_percentage 11762 1726853313.64779: checking to see if all hosts have failed and the running result is not ok 11762 1726853313.64780: done checking to see if all hosts have failed 11762 1726853313.64781: getting the remaining hosts for this loop 11762 1726853313.64783: done getting the remaining hosts for this loop 11762 1726853313.64787: getting the next task for host managed_node2 11762 1726853313.64797: done getting next task for host managed_node2 11762 1726853313.64802: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11762 1726853313.64807: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853313.64892: getting variables 11762 1726853313.64895: in VariableManager get_vars() 11762 1726853313.64946: Calling all_inventory to load vars for managed_node2 11762 1726853313.64949: Calling groups_inventory to load vars for managed_node2 11762 1726853313.64952: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853313.64962: Calling all_plugins_play to load vars for managed_node2 11762 1726853313.64966: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853313.64969: Calling groups_plugins_play to load vars for managed_node2 11762 1726853313.65720: done sending task result for task 02083763-bbaf-d845-03d0-000000000e17 11762 1726853313.65724: WORKER PROCESS EXITING 11762 1726853313.67017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853313.69011: done with get_vars() 11762 1726853313.69035: done getting variables 11762 1726853313.69099: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:28:33 -0400 (0:00:00.112) 0:01:04.121 ****** 11762 1726853313.69133: entering _queue_task() for managed_node2/service 11762 1726853313.69461: worker is 1 (out of 1 available) 11762 1726853313.69476: exiting _queue_task() for managed_node2/service 11762 1726853313.69490: done queuing things up, now waiting for results queue to drain 11762 1726853313.69491: waiting for pending results... 11762 1726853313.69800: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11762 1726853313.69973: in run() - task 02083763-bbaf-d845-03d0-000000000e18 11762 1726853313.69996: variable 'ansible_search_path' from source: unknown 11762 1726853313.70004: variable 'ansible_search_path' from source: unknown 11762 1726853313.70046: calling self._execute() 11762 1726853313.70151: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.70165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853313.70186: variable 'omit' from source: magic vars 11762 1726853313.70567: variable 'ansible_distribution_major_version' from source: facts 11762 1726853313.70585: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853313.70709: variable 'network_provider' from source: set_fact 11762 1726853313.70727: Evaluated conditional (network_provider == "initscripts"): False 11762 1726853313.70734: when evaluation is False, skipping this task 11762 1726853313.70740: _execute() done 
11762 1726853313.70748: dumping result to json 11762 1726853313.70830: done dumping result, returning 11762 1726853313.70834: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-d845-03d0-000000000e18] 11762 1726853313.70836: sending task result for task 02083763-bbaf-d845-03d0-000000000e18 11762 1726853313.70907: done sending task result for task 02083763-bbaf-d845-03d0-000000000e18 11762 1726853313.70910: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11762 1726853313.70977: no more pending results, returning what we have 11762 1726853313.70982: results queue empty 11762 1726853313.70983: checking for any_errors_fatal 11762 1726853313.70992: done checking for any_errors_fatal 11762 1726853313.70993: checking for max_fail_percentage 11762 1726853313.70995: done checking for max_fail_percentage 11762 1726853313.70996: checking to see if all hosts have failed and the running result is not ok 11762 1726853313.70997: done checking to see if all hosts have failed 11762 1726853313.70997: getting the remaining hosts for this loop 11762 1726853313.71000: done getting the remaining hosts for this loop 11762 1726853313.71003: getting the next task for host managed_node2 11762 1726853313.71012: done getting next task for host managed_node2 11762 1726853313.71016: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11762 1726853313.71022: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853313.71050: getting variables 11762 1726853313.71052: in VariableManager get_vars() 11762 1726853313.71101: Calling all_inventory to load vars for managed_node2 11762 1726853313.71103: Calling groups_inventory to load vars for managed_node2 11762 1726853313.71106: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853313.71117: Calling all_plugins_play to load vars for managed_node2 11762 1726853313.71120: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853313.71123: Calling groups_plugins_play to load vars for managed_node2 11762 1726853313.72700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853313.74484: done with get_vars() 11762 1726853313.74513: done getting variables 11762 1726853313.74625: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is 
present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:28:33 -0400 (0:00:00.055) 0:01:04.177 ****** 11762 1726853313.74663: entering _queue_task() for managed_node2/copy 11762 1726853313.75005: worker is 1 (out of 1 available) 11762 1726853313.75020: exiting _queue_task() for managed_node2/copy 11762 1726853313.75034: done queuing things up, now waiting for results queue to drain 11762 1726853313.75036: waiting for pending results... 11762 1726853313.75401: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11762 1726853313.75426: in run() - task 02083763-bbaf-d845-03d0-000000000e19 11762 1726853313.75442: variable 'ansible_search_path' from source: unknown 11762 1726853313.75445: variable 'ansible_search_path' from source: unknown 11762 1726853313.75483: calling self._execute() 11762 1726853313.75580: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.75587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853313.75594: variable 'omit' from source: magic vars 11762 1726853313.76150: variable 'ansible_distribution_major_version' from source: facts 11762 1726853313.76154: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853313.76156: variable 'network_provider' from source: set_fact 11762 1726853313.76159: Evaluated conditional (network_provider == "initscripts"): False 11762 1726853313.76161: when evaluation is False, skipping this task 11762 1726853313.76163: _execute() done 11762 1726853313.76165: dumping result to json 11762 1726853313.76168: done dumping result, returning 11762 1726853313.76172: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-d845-03d0-000000000e19] 11762 
1726853313.76176: sending task result for task 02083763-bbaf-d845-03d0-000000000e19 11762 1726853313.76242: done sending task result for task 02083763-bbaf-d845-03d0-000000000e19 11762 1726853313.76244: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11762 1726853313.76304: no more pending results, returning what we have 11762 1726853313.76308: results queue empty 11762 1726853313.76309: checking for any_errors_fatal 11762 1726853313.76315: done checking for any_errors_fatal 11762 1726853313.76316: checking for max_fail_percentage 11762 1726853313.76318: done checking for max_fail_percentage 11762 1726853313.76318: checking to see if all hosts have failed and the running result is not ok 11762 1726853313.76319: done checking to see if all hosts have failed 11762 1726853313.76320: getting the remaining hosts for this loop 11762 1726853313.76322: done getting the remaining hosts for this loop 11762 1726853313.76325: getting the next task for host managed_node2 11762 1726853313.76332: done getting next task for host managed_node2 11762 1726853313.76336: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11762 1726853313.76340: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853313.76360: getting variables 11762 1726853313.76362: in VariableManager get_vars() 11762 1726853313.76611: Calling all_inventory to load vars for managed_node2 11762 1726853313.76614: Calling groups_inventory to load vars for managed_node2 11762 1726853313.76616: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853313.76626: Calling all_plugins_play to load vars for managed_node2 11762 1726853313.76629: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853313.76632: Calling groups_plugins_play to load vars for managed_node2 11762 1726853313.78320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853313.80392: done with get_vars() 11762 1726853313.80418: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:28:33 -0400 (0:00:00.058) 0:01:04.235 ****** 11762 1726853313.80512: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11762 1726853313.80833: worker is 1 (out of 1 available) 11762 1726853313.80850: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11762 1726853313.80863: done queuing things up, now waiting for results queue to drain 11762 
1726853313.80865: waiting for pending results... 11762 1726853313.81593: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11762 1726853313.81599: in run() - task 02083763-bbaf-d845-03d0-000000000e1a 11762 1726853313.81602: variable 'ansible_search_path' from source: unknown 11762 1726853313.81605: variable 'ansible_search_path' from source: unknown 11762 1726853313.81607: calling self._execute() 11762 1726853313.81610: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.81612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853313.81614: variable 'omit' from source: magic vars 11762 1726853313.81857: variable 'ansible_distribution_major_version' from source: facts 11762 1726853313.81868: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853313.81878: variable 'omit' from source: magic vars 11762 1726853313.81953: variable 'omit' from source: magic vars 11762 1726853313.82117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11762 1726853313.84311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11762 1726853313.84380: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11762 1726853313.84415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11762 1726853313.84445: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11762 1726853313.84473: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11762 1726853313.84678: variable 'network_provider' from source: set_fact 11762 1726853313.84690: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11762 1726853313.84718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11762 1726853313.84743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11762 1726853313.84791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11762 1726853313.85003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11762 1726853313.85042: variable 'omit' from source: magic vars 11762 1726853313.85152: variable 'omit' from source: magic vars 11762 1726853313.85251: variable 'network_connections' from source: task vars 11762 1726853313.85263: variable 'port2_profile' from source: play vars 11762 1726853313.85628: variable 'port2_profile' from source: play vars 11762 1726853313.85636: variable 'port1_profile' from source: play vars 11762 1726853313.85697: variable 'port1_profile' from source: play vars 11762 1726853313.85822: variable 'controller_profile' from source: play vars 11762 1726853313.85930: variable 'controller_profile' from source: play vars 11762 1726853313.86209: variable 'omit' from source: magic vars 11762 1726853313.86217: variable '__lsr_ansible_managed' from source: task vars 11762 1726853313.86392: variable '__lsr_ansible_managed' from source: task vars 11762 1726853313.86808: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11762 1726853313.87357: Loaded config def from plugin (lookup/template) 11762 1726853313.87361: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11762 1726853313.87389: File lookup term: get_ansible_managed.j2 11762 1726853313.87392: variable 'ansible_search_path' from source: unknown 11762 1726853313.87397: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11762 1726853313.87411: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11762 1726853313.87427: variable 'ansible_search_path' from source: unknown 11762 1726853313.93916: variable 'ansible_managed' from source: unknown 11762 1726853313.94177: variable 'omit' from source: magic vars 11762 1726853313.94181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 
1726853313.94184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853313.94187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853313.94189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853313.94191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853313.94193: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853313.94196: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.94198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853313.94243: Set connection var ansible_timeout to 10 11762 1726853313.94246: Set connection var ansible_shell_type to sh 11762 1726853313.94254: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853313.94260: Set connection var ansible_shell_executable to /bin/sh 11762 1726853313.94273: Set connection var ansible_pipelining to False 11762 1726853313.94281: Set connection var ansible_connection to ssh 11762 1726853313.94303: variable 'ansible_shell_executable' from source: unknown 11762 1726853313.94306: variable 'ansible_connection' from source: unknown 11762 1726853313.94308: variable 'ansible_module_compression' from source: unknown 11762 1726853313.94311: variable 'ansible_shell_type' from source: unknown 11762 1726853313.94313: variable 'ansible_shell_executable' from source: unknown 11762 1726853313.94315: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853313.94320: variable 'ansible_pipelining' from source: unknown 11762 1726853313.94322: variable 'ansible_timeout' from source: unknown 11762 1726853313.94331: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 11762 1726853313.94501: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853313.94510: variable 'omit' from source: magic vars 11762 1726853313.94518: starting attempt loop 11762 1726853313.94521: running the handler 11762 1726853313.94576: _low_level_execute_command(): starting 11762 1726853313.94579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853313.96090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853313.96197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.96315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853313.96334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853313.96445: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11762 1726853313.98206: stdout chunk (state=3): >>>/root <<< 11762 1726853313.98246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853313.98378: stderr chunk (state=3): >>><<< 11762 1726853313.98381: stdout chunk (state=3): >>><<< 11762 1726853313.98405: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853313.98418: _low_level_execute_command(): starting 11762 1726853313.98424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914 `" && echo ansible-tmp-1726853313.9840581-14766-80999286068914="` echo /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914 `" 
) && sleep 0' 11762 1726853313.99747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853313.99751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.99754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853313.99756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853313.99758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853313.99785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853314.00003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853314.02218: stdout chunk (state=3): >>>ansible-tmp-1726853313.9840581-14766-80999286068914=/root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914 <<< 11762 1726853314.02223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853314.02225: stderr chunk (state=3): >>><<< 11762 1726853314.02227: stdout chunk (state=3): >>><<< 11762 1726853314.02392: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853313.9840581-14766-80999286068914=/root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853314.02879: variable 'ansible_module_compression' from source: unknown 11762 1726853314.02883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11762 1726853314.02885: variable 'ansible_facts' from source: unknown 11762 1726853314.02887: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/AnsiballZ_network_connections.py 11762 1726853314.03240: Sending initial data 11762 1726853314.03243: Sent initial data (167 bytes) 11762 1726853314.04551: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853314.04695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853314.04704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853314.04800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853314.06503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11762 1726853314.06507: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853314.06583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853314.06652: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpowhz9ekd /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/AnsiballZ_network_connections.py <<< 11762 1726853314.06655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/AnsiballZ_network_connections.py" <<< 11762 1726853314.06758: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpowhz9ekd" to remote "/root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/AnsiballZ_network_connections.py" <<< 11762 1726853314.09112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853314.09164: stderr chunk (state=3): >>><<< 11762 1726853314.09176: stdout chunk (state=3): >>><<< 11762 1726853314.09239: done transferring module to remote 11762 1726853314.09464: _low_level_execute_command(): starting 11762 1726853314.09468: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/ /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/AnsiballZ_network_connections.py && sleep 0' 11762 1726853314.10585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853314.10645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853314.10658: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853314.10669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853314.10681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853314.10740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853314.10776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853314.10842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853314.12938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853314.12977: stderr chunk (state=3): >>><<< 11762 1726853314.13038: stdout chunk (state=3): >>><<< 11762 1726853314.13087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853314.13090: _low_level_execute_command(): starting 11762 1726853314.13093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/AnsiballZ_network_connections.py && sleep 0' 11762 1726853314.14679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853314.14983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853314.15131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853314.15245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853314.70126: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5c7babf4-8462-428b-96e1-53b35f33a6df: error=unknown <<< 11762 1726853314.71895: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize 
aborted on bond0.0/0632f651-903c-44ef-ab96-c625e73a569b: error=unknown <<< 11762 1726853314.73840: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/61017df5-d8c8-402f-9503-fd0fc150036f: error=unknown <<< 11762 1726853314.74100: stdout chunk (state=3): >>> <<< 11762 1726853314.74114: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11762 1726853314.76095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.9.197 closed. <<< 11762 1726853314.76132: stderr chunk (state=3): >>><<< 11762 1726853314.76135: stdout chunk (state=3): >>><<< 11762 1726853314.76153: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5c7babf4-8462-428b-96e1-53b35f33a6df: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0632f651-903c-44ef-ab96-c625e73a569b: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oazu6gd5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/61017df5-d8c8-402f-9503-fd0fc150036f: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853314.76188: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853314.76196: 
_low_level_execute_command(): starting 11762 1726853314.76201: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853313.9840581-14766-80999286068914/ > /dev/null 2>&1 && sleep 0' 11762 1726853314.76629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853314.76637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853314.76663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853314.76669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853314.76678: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853314.76684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853314.76729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853314.76732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853314.76737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853314.76807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853314.78709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
11762 1726853314.78736: stderr chunk (state=3): >>><<< 11762 1726853314.78739: stdout chunk (state=3): >>><<< 11762 1726853314.78753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853314.78759: handler run complete 11762 1726853314.78783: attempt loop complete, returning result 11762 1726853314.78786: _execute() done 11762 1726853314.78788: dumping result to json 11762 1726853314.78793: done dumping result, returning 11762 1726853314.78801: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-d845-03d0-000000000e1a] 11762 1726853314.78805: sending task result for task 02083763-bbaf-d845-03d0-000000000e1a 11762 1726853314.78908: done sending task result for task 02083763-bbaf-d845-03d0-000000000e1a 11762 
1726853314.78911: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11762 1726853314.79042: no more pending results, returning what we have 11762 1726853314.79046: results queue empty 11762 1726853314.79046: checking for any_errors_fatal 11762 1726853314.79052: done checking for any_errors_fatal 11762 1726853314.79052: checking for max_fail_percentage 11762 1726853314.79054: done checking for max_fail_percentage 11762 1726853314.79055: checking to see if all hosts have failed and the running result is not ok 11762 1726853314.79056: done checking to see if all hosts have failed 11762 1726853314.79056: getting the remaining hosts for this loop 11762 1726853314.79058: done getting the remaining hosts for this loop 11762 1726853314.79061: getting the next task for host managed_node2 11762 1726853314.79068: done getting next task for host managed_node2 11762 1726853314.79073: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11762 1726853314.79077: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
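The `module_args` in the result above show the `fedora.linux_system_roles.network_connections` module being driven by the network role to deactivate and delete the `bond0`, `bond0.0`, and `bond0.1` profiles via the `nm` (NetworkManager) provider. A reconstructed sketch of role variables that would produce this invocation, inferred from the logged `module_args` (the play name and host pattern are assumptions, not part of the log):

```yaml
# Sketch inferred from the logged module_args; not part of the captured run.
- name: Tear down the bond and its port profiles
  hosts: managed_node2
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_provider: nm
    network_connections:
      # Ports are listed before the controller so they go down first.
      - name: bond0.1
        persistent_state: absent   # delete the connection profile
        state: down                # deactivate it before removal
      - name: bond0.0
        persistent_state: absent
        state: down
      - name: bond0
        persistent_state: absent
        state: down
```

Note that the task still reports `"changed": true` with an empty `warnings` list even though the module printed `LsrNetworkNmError: Connection volatilize aborted ... error=unknown` tracebacks to stdout for each profile; with `ignore_errors: false` in the module args, the run treated the teardown as successful.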
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853314.79089: getting variables 11762 1726853314.79091: in VariableManager get_vars() 11762 1726853314.79132: Calling all_inventory to load vars for managed_node2 11762 1726853314.79135: Calling groups_inventory to load vars for managed_node2 11762 1726853314.79137: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853314.79146: Calling all_plugins_play to load vars for managed_node2 11762 1726853314.79148: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853314.79151: Calling groups_plugins_play to load vars for managed_node2 11762 1726853314.79949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853314.80921: done with get_vars() 11762 1726853314.80936: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:28:34 -0400 (0:00:01.004) 0:01:05.240 ****** 11762 1726853314.80999: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11762 1726853314.81220: worker is 1 (out of 1 available) 11762 1726853314.81233: exiting 
_queue_task() for managed_node2/fedora.linux_system_roles.network_state 11762 1726853314.81247: done queuing things up, now waiting for results queue to drain 11762 1726853314.81249: waiting for pending results... 11762 1726853314.81435: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11762 1726853314.81537: in run() - task 02083763-bbaf-d845-03d0-000000000e1b 11762 1726853314.81551: variable 'ansible_search_path' from source: unknown 11762 1726853314.81557: variable 'ansible_search_path' from source: unknown 11762 1726853314.81588: calling self._execute() 11762 1726853314.81660: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.81664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.81674: variable 'omit' from source: magic vars 11762 1726853314.81946: variable 'ansible_distribution_major_version' from source: facts 11762 1726853314.81958: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853314.82041: variable 'network_state' from source: role '' defaults 11762 1726853314.82053: Evaluated conditional (network_state != {}): False 11762 1726853314.82057: when evaluation is False, skipping this task 11762 1726853314.82059: _execute() done 11762 1726853314.82062: dumping result to json 11762 1726853314.82064: done dumping result, returning 11762 1726853314.82072: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-d845-03d0-000000000e1b] 11762 1726853314.82077: sending task result for task 02083763-bbaf-d845-03d0-000000000e1b 11762 1726853314.82159: done sending task result for task 02083763-bbaf-d845-03d0-000000000e1b 11762 1726853314.82162: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11762 
1726853314.82213: no more pending results, returning what we have 11762 1726853314.82218: results queue empty 11762 1726853314.82218: checking for any_errors_fatal 11762 1726853314.82228: done checking for any_errors_fatal 11762 1726853314.82229: checking for max_fail_percentage 11762 1726853314.82230: done checking for max_fail_percentage 11762 1726853314.82231: checking to see if all hosts have failed and the running result is not ok 11762 1726853314.82232: done checking to see if all hosts have failed 11762 1726853314.82233: getting the remaining hosts for this loop 11762 1726853314.82235: done getting the remaining hosts for this loop 11762 1726853314.82237: getting the next task for host managed_node2 11762 1726853314.82244: done getting next task for host managed_node2 11762 1726853314.82247: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11762 1726853314.82252: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11762 1726853314.82269: getting variables 11762 1726853314.82273: in VariableManager get_vars() 11762 1726853314.82307: Calling all_inventory to load vars for managed_node2 11762 1726853314.82309: Calling groups_inventory to load vars for managed_node2 11762 1726853314.82311: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853314.82319: Calling all_plugins_play to load vars for managed_node2 11762 1726853314.82321: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853314.82324: Calling groups_plugins_play to load vars for managed_node2 11762 1726853314.83051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853314.83910: done with get_vars() 11762 1726853314.83925: done getting variables 11762 1726853314.83964: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:28:34 -0400 (0:00:00.029) 0:01:05.270 ****** 11762 1726853314.84000: entering _queue_task() for managed_node2/debug 11762 1726853314.84203: worker is 1 (out of 1 available) 11762 1726853314.84217: exiting _queue_task() for managed_node2/debug 11762 1726853314.84230: done queuing things up, now waiting for results queue to drain 11762 1726853314.84232: waiting for pending results... 
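The serialized "HOST STATE" blob above (block/task/rescue/always counters plus nested child states) is ansible-core's internal per-host play-iterator position. As a minimal sketch, assuming only the `key=<int>` fields matter, the flat counters of one fragment can be pulled out with a regex; `parse_host_state` is a hypothetical helper, not part of ansible-core:

```python
import re

# Hypothetical helper (not part of ansible-core): extract the integer
# counters out of one "HOST STATE: ..." fragment from -vvvv output.
# Boolean fields like update_handlers=True and the nested child states
# are deliberately ignored in this sketch.
STATE_FIELDS = re.compile(r"(\w+)=(\d+)")

def parse_host_state(fragment: str) -> dict:
    """Return the integer fields of a HOST STATE repr as a dict."""
    return {k: int(v) for k, v in STATE_FIELDS.findall(fragment)}

state = parse_host_state(
    "HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, "
    "run_state=1, fail_state=0"
)
```

In the trace the same structure recurs after every task, with only the innermost `task=` counter advancing (22, 23, 24, 25), which is how the iterator walks the role's task list.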
11762 1726853314.84405: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11762 1726853314.84507: in run() - task 02083763-bbaf-d845-03d0-000000000e1c 11762 1726853314.84519: variable 'ansible_search_path' from source: unknown 11762 1726853314.84523: variable 'ansible_search_path' from source: unknown 11762 1726853314.84551: calling self._execute() 11762 1726853314.84624: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.84628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.84636: variable 'omit' from source: magic vars 11762 1726853314.84908: variable 'ansible_distribution_major_version' from source: facts 11762 1726853314.84917: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853314.84926: variable 'omit' from source: magic vars 11762 1726853314.84975: variable 'omit' from source: magic vars 11762 1726853314.85000: variable 'omit' from source: magic vars 11762 1726853314.85033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853314.85060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853314.85076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853314.85090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853314.85099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853314.85124: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853314.85128: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.85130: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11762 1726853314.85197: Set connection var ansible_timeout to 10 11762 1726853314.85201: Set connection var ansible_shell_type to sh 11762 1726853314.85204: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853314.85210: Set connection var ansible_shell_executable to /bin/sh 11762 1726853314.85217: Set connection var ansible_pipelining to False 11762 1726853314.85230: Set connection var ansible_connection to ssh 11762 1726853314.85242: variable 'ansible_shell_executable' from source: unknown 11762 1726853314.85247: variable 'ansible_connection' from source: unknown 11762 1726853314.85251: variable 'ansible_module_compression' from source: unknown 11762 1726853314.85253: variable 'ansible_shell_type' from source: unknown 11762 1726853314.85255: variable 'ansible_shell_executable' from source: unknown 11762 1726853314.85258: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.85262: variable 'ansible_pipelining' from source: unknown 11762 1726853314.85265: variable 'ansible_timeout' from source: unknown 11762 1726853314.85268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.85373: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853314.85383: variable 'omit' from source: magic vars 11762 1726853314.85388: starting attempt loop 11762 1726853314.85391: running the handler 11762 1726853314.85486: variable '__network_connections_result' from source: set_fact 11762 1726853314.85525: handler run complete 11762 1726853314.85538: attempt loop complete, returning result 11762 1726853314.85541: _execute() done 11762 1726853314.85543: dumping result to json 11762 1726853314.85550: 
done dumping result, returning 11762 1726853314.85561: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-d845-03d0-000000000e1c] 11762 1726853314.85563: sending task result for task 02083763-bbaf-d845-03d0-000000000e1c 11762 1726853314.85644: done sending task result for task 02083763-bbaf-d845-03d0-000000000e1c 11762 1726853314.85647: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 11762 1726853314.85719: no more pending results, returning what we have 11762 1726853314.85722: results queue empty 11762 1726853314.85723: checking for any_errors_fatal 11762 1726853314.85727: done checking for any_errors_fatal 11762 1726853314.85727: checking for max_fail_percentage 11762 1726853314.85729: done checking for max_fail_percentage 11762 1726853314.85729: checking to see if all hosts have failed and the running result is not ok 11762 1726853314.85730: done checking to see if all hosts have failed 11762 1726853314.85731: getting the remaining hosts for this loop 11762 1726853314.85732: done getting the remaining hosts for this loop 11762 1726853314.85735: getting the next task for host managed_node2 11762 1726853314.85741: done getting next task for host managed_node2 11762 1726853314.85744: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11762 1726853314.85748: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853314.85758: getting variables 11762 1726853314.85759: in VariableManager get_vars() 11762 1726853314.85795: Calling all_inventory to load vars for managed_node2 11762 1726853314.85797: Calling groups_inventory to load vars for managed_node2 11762 1726853314.85799: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853314.85806: Calling all_plugins_play to load vars for managed_node2 11762 1726853314.85809: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853314.85811: Calling groups_plugins_play to load vars for managed_node2 11762 1726853314.86681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853314.87525: done with get_vars() 11762 1726853314.87540: done getting variables 11762 1726853314.87584: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:28:34 -0400 (0:00:00.036) 0:01:05.306 ****** 11762 1726853314.87612: entering _queue_task() for managed_node2/debug 11762 1726853314.87821: worker is 1 (out of 1 available) 11762 1726853314.87835: exiting _queue_task() for managed_node2/debug 11762 1726853314.87847: done queuing things up, now waiting for results queue to drain 11762 1726853314.87849: waiting for pending results... 11762 1726853314.88033: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11762 1726853314.88135: in run() - task 02083763-bbaf-d845-03d0-000000000e1d 11762 1726853314.88148: variable 'ansible_search_path' from source: unknown 11762 1726853314.88152: variable 'ansible_search_path' from source: unknown 11762 1726853314.88183: calling self._execute() 11762 1726853314.88260: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.88265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.88276: variable 'omit' from source: magic vars 11762 1726853314.88552: variable 'ansible_distribution_major_version' from source: facts 11762 1726853314.88562: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853314.88568: variable 'omit' from source: magic vars 11762 1726853314.88616: variable 'omit' from source: magic vars 11762 1726853314.88647: variable 'omit' from source: magic vars 11762 1726853314.88678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853314.88705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853314.88721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853314.88736: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853314.88748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853314.88768: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853314.88773: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.88776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.88847: Set connection var ansible_timeout to 10 11762 1726853314.88850: Set connection var ansible_shell_type to sh 11762 1726853314.88853: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853314.88855: Set connection var ansible_shell_executable to /bin/sh 11762 1726853314.88862: Set connection var ansible_pipelining to False 11762 1726853314.88867: Set connection var ansible_connection to ssh 11762 1726853314.88886: variable 'ansible_shell_executable' from source: unknown 11762 1726853314.88889: variable 'ansible_connection' from source: unknown 11762 1726853314.88892: variable 'ansible_module_compression' from source: unknown 11762 1726853314.88894: variable 'ansible_shell_type' from source: unknown 11762 1726853314.88896: variable 'ansible_shell_executable' from source: unknown 11762 1726853314.88898: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.88901: variable 'ansible_pipelining' from source: unknown 11762 1726853314.88904: variable 'ansible_timeout' from source: unknown 11762 1726853314.88908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.89011: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853314.89022: variable 'omit' from source: magic vars 11762 1726853314.89027: starting attempt loop 11762 1726853314.89030: running the handler 11762 1726853314.89070: variable '__network_connections_result' from source: set_fact 11762 1726853314.89127: variable '__network_connections_result' from source: set_fact 11762 1726853314.89216: handler run complete 11762 1726853314.89233: attempt loop complete, returning result 11762 1726853314.89236: _execute() done 11762 1726853314.89238: dumping result to json 11762 1726853314.89241: done dumping result, returning 11762 1726853314.89251: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-d845-03d0-000000000e1d] 11762 1726853314.89255: sending task result for task 02083763-bbaf-d845-03d0-000000000e1d 11762 1726853314.89351: done sending task result for task 02083763-bbaf-d845-03d0-000000000e1d 11762 1726853314.89354: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0.1",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0.0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
11762 1726853314.89466: no more pending results, returning what we have 11762 1726853314.89470: results queue empty 11762 1726853314.89472: checking for any_errors_fatal 11762 1726853314.89478: done checking for any_errors_fatal 11762 1726853314.89479: 
checking for max_fail_percentage 11762 1726853314.89480: done checking for max_fail_percentage 11762 1726853314.89481: checking to see if all hosts have failed and the running result is not ok 11762 1726853314.89481: done checking to see if all hosts have failed 11762 1726853314.89482: getting the remaining hosts for this loop 11762 1726853314.89483: done getting the remaining hosts for this loop 11762 1726853314.89486: getting the next task for host managed_node2 11762 1726853314.89492: done getting next task for host managed_node2 11762 1726853314.89495: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11762 1726853314.89498: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853314.89508: getting variables 11762 1726853314.89509: in VariableManager get_vars() 11762 1726853314.89547: Calling all_inventory to load vars for managed_node2 11762 1726853314.89558: Calling groups_inventory to load vars for managed_node2 11762 1726853314.89560: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853314.89566: Calling all_plugins_play to load vars for managed_node2 11762 1726853314.89568: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853314.89570: Calling groups_plugins_play to load vars for managed_node2 11762 1726853314.90320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853314.91185: done with get_vars() 11762 1726853314.91201: done getting variables 11762 1726853314.91240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:28:34 -0400 (0:00:00.036) 0:01:05.343 ****** 11762 1726853314.91266: entering _queue_task() for managed_node2/debug 11762 1726853314.91498: worker is 1 (out of 1 available) 11762 1726853314.91512: exiting _queue_task() for managed_node2/debug 11762 1726853314.91526: done queuing things up, now waiting for results queue to drain 11762 1726853314.91528: waiting for pending results... 
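Every entry in this trace carries the same prefix: the controller PID, a high-resolution Unix timestamp, a colon, then the message. A hedged sketch of a parser for that prefix (the `parse_log_line` helper and the exact regex are assumptions inferred from this log, not an ansible-core API):

```python
import re
from datetime import datetime, timezone

# Assumed line shape, inferred from this -vvvv log:
#   "<pid> <epoch_seconds.fraction>: <message>"
LOG_LINE = re.compile(r"^(\d+) (\d+\.\d+): (.*)$")

def parse_log_line(line: str):
    """Split one verbose-log entry into (pid, UTC timestamp, message)."""
    m = LOG_LINE.match(line)
    if not m:
        return None
    pid, epoch, msg = m.groups()
    ts = datetime.fromtimestamp(float(epoch), tz=timezone.utc)
    return int(pid), ts, msg

pid, ts, msg = parse_log_line(
    "11762 1726853314.91266: entering _queue_task() for managed_node2/debug"
)
```

The epoch values decode to the same Friday 20 September 2024 shown in the human-readable task headers, which makes the prefix useful for per-task timing when the `(0:00:00.036)` deltas are not enough.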
11762 1726853314.91706: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11762 1726853314.91812: in run() - task 02083763-bbaf-d845-03d0-000000000e1e 11762 1726853314.91824: variable 'ansible_search_path' from source: unknown 11762 1726853314.91828: variable 'ansible_search_path' from source: unknown 11762 1726853314.91861: calling self._execute() 11762 1726853314.91931: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.91935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.91948: variable 'omit' from source: magic vars 11762 1726853314.92230: variable 'ansible_distribution_major_version' from source: facts 11762 1726853314.92240: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853314.92326: variable 'network_state' from source: role '' defaults 11762 1726853314.92335: Evaluated conditional (network_state != {}): False 11762 1726853314.92339: when evaluation is False, skipping this task 11762 1726853314.92342: _execute() done 11762 1726853314.92347: dumping result to json 11762 1726853314.92350: done dumping result, returning 11762 1726853314.92356: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-d845-03d0-000000000e1e] 11762 1726853314.92361: sending task result for task 02083763-bbaf-d845-03d0-000000000e1e 11762 1726853314.92449: done sending task result for task 02083763-bbaf-d845-03d0-000000000e1e 11762 1726853314.92451: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11762 1726853314.92500: no more pending results, returning what we have 11762 1726853314.92504: results queue empty 11762 1726853314.92505: checking for any_errors_fatal 11762 1726853314.92516: done checking for any_errors_fatal 11762 1726853314.92517: checking for 
max_fail_percentage 11762 1726853314.92519: done checking for max_fail_percentage 11762 1726853314.92520: checking to see if all hosts have failed and the running result is not ok 11762 1726853314.92521: done checking to see if all hosts have failed 11762 1726853314.92522: getting the remaining hosts for this loop 11762 1726853314.92524: done getting the remaining hosts for this loop 11762 1726853314.92526: getting the next task for host managed_node2 11762 1726853314.92533: done getting next task for host managed_node2 11762 1726853314.92536: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11762 1726853314.92541: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853314.92564: getting variables 11762 1726853314.92565: in VariableManager get_vars() 11762 1726853314.92601: Calling all_inventory to load vars for managed_node2 11762 1726853314.92604: Calling groups_inventory to load vars for managed_node2 11762 1726853314.92606: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853314.92614: Calling all_plugins_play to load vars for managed_node2 11762 1726853314.92616: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853314.92619: Calling groups_plugins_play to load vars for managed_node2 11762 1726853314.93546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853314.94402: done with get_vars() 11762 1726853314.94417: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:28:34 -0400 (0:00:00.032) 0:01:05.375 ****** 11762 1726853314.94491: entering _queue_task() for managed_node2/ping 11762 1726853314.94721: worker is 1 (out of 1 available) 11762 1726853314.94734: exiting _queue_task() for managed_node2/ping 11762 1726853314.94750: done queuing things up, now waiting for results queue to drain 11762 1726853314.94752: waiting for pending results... 
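The skip just above ("Evaluated conditional (network_state != {}): False ... when evaluation is False, skipping this task") is ordinary `when:` handling: the role default `network_state` is an empty dict, so the debug task never runs. A minimal illustrative sketch of that decision in plain Python (not the role's actual code, which evaluates the condition through Jinja2):

```python
# Illustrative only: mimics the effect of `when: network_state != {}`
# when the role-default network_state is an empty dict.
def should_run(network_state: dict) -> bool:
    return network_state != {}

# With the default {}, the task is skipped and ansible reports the
# failed condition back in the result, as seen in the log.
result = {
    "skipped": not should_run({}),
    "false_condition": "network_state != {}",
}
```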
11762 1726853314.94934: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11762 1726853314.95087: in run() - task 02083763-bbaf-d845-03d0-000000000e1f 11762 1726853314.95091: variable 'ansible_search_path' from source: unknown 11762 1726853314.95094: variable 'ansible_search_path' from source: unknown 11762 1726853314.95151: calling self._execute() 11762 1726853314.95393: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.95397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.95400: variable 'omit' from source: magic vars 11762 1726853314.95585: variable 'ansible_distribution_major_version' from source: facts 11762 1726853314.95597: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853314.95606: variable 'omit' from source: magic vars 11762 1726853314.95669: variable 'omit' from source: magic vars 11762 1726853314.95703: variable 'omit' from source: magic vars 11762 1726853314.95780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853314.95784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853314.95802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853314.95939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853314.95952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853314.95956: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853314.95959: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.95961: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11762 1726853314.95964: Set connection var ansible_timeout to 10 11762 1726853314.95966: Set connection var ansible_shell_type to sh 11762 1726853314.95968: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853314.95972: Set connection var ansible_shell_executable to /bin/sh 11762 1726853314.95989: Set connection var ansible_pipelining to False 11762 1726853314.95991: Set connection var ansible_connection to ssh 11762 1726853314.96009: variable 'ansible_shell_executable' from source: unknown 11762 1726853314.96012: variable 'ansible_connection' from source: unknown 11762 1726853314.96015: variable 'ansible_module_compression' from source: unknown 11762 1726853314.96017: variable 'ansible_shell_type' from source: unknown 11762 1726853314.96019: variable 'ansible_shell_executable' from source: unknown 11762 1726853314.96021: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853314.96026: variable 'ansible_pipelining' from source: unknown 11762 1726853314.96028: variable 'ansible_timeout' from source: unknown 11762 1726853314.96032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853314.96223: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11762 1726853314.96233: variable 'omit' from source: magic vars 11762 1726853314.96239: starting attempt loop 11762 1726853314.96242: running the handler 11762 1726853314.96255: _low_level_execute_command(): starting 11762 1726853314.96266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853314.96881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 
1726853314.96916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853314.96920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853314.96968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853314.96974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853314.97058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853314.98805: stdout chunk (state=3): >>>/root <<< 11762 1726853314.99059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853314.99062: stdout chunk (state=3): >>><<< 11762 1726853314.99064: stderr chunk (state=3): >>><<< 11762 1726853314.99068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853314.99074: _low_level_execute_command(): starting 11762 1726853314.99077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958 `" && echo ansible-tmp-1726853314.9896677-14808-268617922934958="` echo /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958 `" ) && sleep 0' 11762 1726853314.99617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853314.99631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853314.99646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853314.99665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853314.99684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853314.99725: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853314.99812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853314.99840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.00053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.02020: stdout chunk (state=3): >>>ansible-tmp-1726853314.9896677-14808-268617922934958=/root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958 <<< 11762 1726853315.02173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.02177: stdout chunk (state=3): >>><<< 11762 1726853315.02179: stderr chunk (state=3): >>><<< 11762 1726853315.02385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853314.9896677-14808-268617922934958=/root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853315.02389: variable 'ansible_module_compression' from source: unknown 11762 1726853315.02391: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11762 1726853315.02393: variable 'ansible_facts' from source: unknown 11762 1726853315.02419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/AnsiballZ_ping.py 11762 1726853315.02616: Sending initial data 11762 1726853315.02628: Sent initial data (153 bytes) 11762 1726853315.03183: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853315.03196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.03210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.03226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853315.03241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853315.03259: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853315.03281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 
1726853315.03379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.03400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.03493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.05221: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853315.05307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853315.05389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpofl1fmkk /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/AnsiballZ_ping.py <<< 11762 1726853315.05392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/AnsiballZ_ping.py" <<< 11762 1726853315.05474: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpofl1fmkk" to remote "/root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/AnsiballZ_ping.py" <<< 11762 1726853315.06376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.06379: stderr chunk (state=3): >>><<< 11762 1726853315.06382: stdout chunk (state=3): >>><<< 11762 1726853315.06384: done transferring module to remote 11762 1726853315.06396: _low_level_execute_command(): starting 11762 1726853315.06405: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/ /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/AnsiballZ_ping.py && sleep 0' 11762 1726853315.07050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853315.07065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.07184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853315.07204: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.07221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.07434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.09284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.09294: stdout chunk (state=3): >>><<< 11762 1726853315.09308: stderr chunk (state=3): >>><<< 11762 1726853315.09328: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853315.09342: _low_level_execute_command(): starting 11762 1726853315.09352: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/AnsiballZ_ping.py && sleep 0' 11762 1726853315.09940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853315.09954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.09968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.09985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853315.10025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.10095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.10122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.10241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.25748: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11762 1726853315.27109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853315.27133: stderr chunk (state=3): >>><<< 11762 1726853315.27136: stdout chunk (state=3): >>><<< 11762 1726853315.27156: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
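The module run above returns the single JSON line `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout, which is all the controller parses out of the stream. As a minimal sketch (simplified from the real `ansible.builtin.ping` module, which also supports `data: crash` to force a failure for testing), the remote side behaves roughly like:

```python
import json

def ping_module(module_args):
    """Sketch of ansible.builtin.ping's remote behavior: echo back the
    'data' argument (default 'pong') as a JSON result on stdout."""
    data = module_args.get("data", "pong")
    if data == "crash":
        # the real module raises here to simulate a module failure
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({"data": "pong"})))
```

The controller treats everything outside that JSON line (all the `debug1:`/`debug2:` SSH chatter) as stderr noise and discards it when building the task result.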
11762 1726853315.27180: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853315.27188: _low_level_execute_command(): starting 11762 1726853315.27193: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853314.9896677-14808-268617922934958/ > /dev/null 2>&1 && sleep 0' 11762 1726853315.27618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.27621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853315.27623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853315.27626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.27628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.27677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.27682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.27757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.29665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.29689: stderr chunk (state=3): >>><<< 11762 1726853315.29692: stdout chunk (state=3): >>><<< 11762 1726853315.29708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
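The tmp-dir lifecycle visible across the commands above (create under `umask 77`, `chmod u+x`, run the module, then `rm -f -r`) can be sketched locally like this; the paths are illustrative, not the real `ansible-tmp-*` names from the log:

```python
import os
import shutil
import stat
import tempfile

# Create the per-task directory under a restrictive umask, as in
# "( umask 77 && mkdir -p ... && mkdir ... )" from the log.
old_umask = os.umask(0o077)
base = os.path.join(tempfile.gettempdir(), "ansible-demo")
os.makedirs(base, exist_ok=True)
tmp = tempfile.mkdtemp(prefix="ansible-tmp-", dir=base)
os.umask(old_umask)

# "chmod u+x" step applied before executing the transferred module file.
os.chmod(tmp, os.stat(tmp).st_mode | stat.S_IXUSR)
print(os.path.isdir(tmp))   # directory exists while the module would run

# "rm -f -r ... > /dev/null 2>&1" cleanup step.
shutil.rmtree(tmp)
print(os.path.isdir(tmp))   # directory is gone afterwards
```

Each `_low_level_execute_command()` in the log is one of these steps wrapped in `/bin/sh -c '... && sleep 0'` over the multiplexed SSH connection.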
11762 1726853315.29715: handler run complete 11762 1726853315.29727: attempt loop complete, returning result 11762 1726853315.29729: _execute() done 11762 1726853315.29732: dumping result to json 11762 1726853315.29734: done dumping result, returning 11762 1726853315.29744: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-d845-03d0-000000000e1f] 11762 1726853315.29751: sending task result for task 02083763-bbaf-d845-03d0-000000000e1f 11762 1726853315.29839: done sending task result for task 02083763-bbaf-d845-03d0-000000000e1f 11762 1726853315.29849: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11762 1726853315.29913: no more pending results, returning what we have 11762 1726853315.29917: results queue empty 11762 1726853315.29918: checking for any_errors_fatal 11762 1726853315.29924: done checking for any_errors_fatal 11762 1726853315.29924: checking for max_fail_percentage 11762 1726853315.29926: done checking for max_fail_percentage 11762 1726853315.29927: checking to see if all hosts have failed and the running result is not ok 11762 1726853315.29928: done checking to see if all hosts have failed 11762 1726853315.29928: getting the remaining hosts for this loop 11762 1726853315.29930: done getting the remaining hosts for this loop 11762 1726853315.29933: getting the next task for host managed_node2 11762 1726853315.29944: done getting next task for host managed_node2 11762 1726853315.29946: ^ task is: TASK: meta (role_complete) 11762 1726853315.29951: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853315.29970: getting variables 11762 1726853315.29973: in VariableManager get_vars() 11762 1726853315.30019: Calling all_inventory to load vars for managed_node2 11762 1726853315.30021: Calling groups_inventory to load vars for managed_node2 11762 1726853315.30023: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853315.30032: Calling all_plugins_play to load vars for managed_node2 11762 1726853315.30034: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853315.30037: Calling groups_plugins_play to load vars for managed_node2 11762 1726853315.30841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853315.31809: done with get_vars() 11762 1726853315.31826: done getting variables 11762 1726853315.31885: done queuing things up, now waiting for results queue to drain 11762 1726853315.31887: results queue empty 11762 1726853315.31887: checking for any_errors_fatal 11762 1726853315.31889: done checking for any_errors_fatal 11762 1726853315.31889: checking for max_fail_percentage 11762 1726853315.31890: done checking for max_fail_percentage 11762 1726853315.31890: checking to see if all hosts have failed and the 
running result is not ok 11762 1726853315.31891: done checking to see if all hosts have failed 11762 1726853315.31891: getting the remaining hosts for this loop 11762 1726853315.31892: done getting the remaining hosts for this loop 11762 1726853315.31894: getting the next task for host managed_node2 11762 1726853315.31896: done getting next task for host managed_node2 11762 1726853315.31898: ^ task is: TASK: Delete the device '{{ controller_device }}' 11762 1726853315.31899: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853315.31901: getting variables 11762 1726853315.31902: in VariableManager get_vars() 11762 1726853315.31914: Calling all_inventory to load vars for managed_node2 11762 1726853315.31916: Calling groups_inventory to load vars for managed_node2 11762 1726853315.31917: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853315.31921: Calling all_plugins_play to load vars for managed_node2 11762 1726853315.31922: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853315.31923: Calling groups_plugins_play to load vars for managed_node2 11762 1726853315.32545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853315.33386: done with get_vars() 11762 1726853315.33400: done getting variables 11762 1726853315.33431: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11762 1726853315.33519: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 13:28:35 -0400 (0:00:00.390) 0:01:05.765 ****** 11762 1726853315.33544: entering _queue_task() for managed_node2/command 11762 1726853315.33791: worker is 1 (out of 1 available) 11762 1726853315.33805: exiting _queue_task() for managed_node2/command 11762 1726853315.33821: done queuing things up, now waiting for results queue to drain 11762 1726853315.33823: waiting for pending results... 
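The task name `Delete the device '{{ controller_device }}'` from the host state becomes `Delete the device 'nm-bond'` in the task banner once the play var is resolved. A toy sketch of that substitution (plain string replacement standing in for Ansible's actual Jinja2 templating):

```python
# Hypothetical stand-in for Ansible's Jinja2 templating of task names.
task_name = "Delete the device '{{ controller_device }}'"
play_vars = {"controller_device": "nm-bond"}

def template(text, variables):
    """Substitute '{{ key }}' placeholders with their values."""
    for key, value in variables.items():
        text = text.replace("{{ " + key + " }}", value)
    return text

print(template(task_name, play_vars))  # Delete the device 'nm-bond'
```

This matches the log line `variable 'controller_device' from source: play vars` immediately before the rendered `TASK [Delete the device 'nm-bond']` banner.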
11762 1726853315.34013: running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' 11762 1726853315.34097: in run() - task 02083763-bbaf-d845-03d0-000000000e4f 11762 1726853315.34110: variable 'ansible_search_path' from source: unknown 11762 1726853315.34113: variable 'ansible_search_path' from source: unknown 11762 1726853315.34140: calling self._execute() 11762 1726853315.34220: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853315.34224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853315.34234: variable 'omit' from source: magic vars 11762 1726853315.34504: variable 'ansible_distribution_major_version' from source: facts 11762 1726853315.34514: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853315.34520: variable 'omit' from source: magic vars 11762 1726853315.34536: variable 'omit' from source: magic vars 11762 1726853315.34608: variable 'controller_device' from source: play vars 11762 1726853315.34624: variable 'omit' from source: magic vars 11762 1726853315.34658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853315.34687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853315.34708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853315.34720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853315.34731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853315.34756: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853315.34759: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853315.34762: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853315.34834: Set connection var ansible_timeout to 10 11762 1726853315.34837: Set connection var ansible_shell_type to sh 11762 1726853315.34842: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853315.34849: Set connection var ansible_shell_executable to /bin/sh 11762 1726853315.34856: Set connection var ansible_pipelining to False 11762 1726853315.34863: Set connection var ansible_connection to ssh 11762 1726853315.34881: variable 'ansible_shell_executable' from source: unknown 11762 1726853315.34884: variable 'ansible_connection' from source: unknown 11762 1726853315.34887: variable 'ansible_module_compression' from source: unknown 11762 1726853315.34889: variable 'ansible_shell_type' from source: unknown 11762 1726853315.34891: variable 'ansible_shell_executable' from source: unknown 11762 1726853315.34894: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853315.34896: variable 'ansible_pipelining' from source: unknown 11762 1726853315.34898: variable 'ansible_timeout' from source: unknown 11762 1726853315.34903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853315.35007: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853315.35017: variable 'omit' from source: magic vars 11762 1726853315.35024: starting attempt loop 11762 1726853315.35028: running the handler 11762 1726853315.35041: _low_level_execute_command(): starting 11762 1726853315.35051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853315.35560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.35564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.35567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.35625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.35628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.35634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.35712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.37460: stdout chunk (state=3): >>>/root <<< 11762 1726853315.37556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.37589: stderr chunk (state=3): >>><<< 11762 1726853315.37592: stdout chunk (state=3): >>><<< 11762 1726853315.37620: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853315.37626: _low_level_execute_command(): starting 11762 1726853315.37633: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516 `" && echo ansible-tmp-1726853315.3761308-14825-214601226992516="` echo /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516 `" ) && sleep 0' 11762 1726853315.38085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.38088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853315.38091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.38102: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.38104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.38144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.38148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.38152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.38226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.40205: stdout chunk (state=3): >>>ansible-tmp-1726853315.3761308-14825-214601226992516=/root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516 <<< 11762 1726853315.40316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.40346: stderr chunk (state=3): >>><<< 11762 1726853315.40352: stdout chunk (state=3): >>><<< 11762 1726853315.40368: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853315.3761308-14825-214601226992516=/root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853315.40397: variable 'ansible_module_compression' from source: unknown 11762 1726853315.40437: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853315.40475: variable 'ansible_facts' from source: unknown 11762 1726853315.40529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/AnsiballZ_command.py 11762 1726853315.40635: Sending initial data 11762 1726853315.40638: Sent initial data (156 bytes) 11762 1726853315.41276: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.41293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.41335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.41352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.41384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.41483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.43159: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853315.43245: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853315.43333: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpu5b_okow /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/AnsiballZ_command.py <<< 11762 1726853315.43337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/AnsiballZ_command.py" <<< 11762 1726853315.43404: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpu5b_okow" to remote "/root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/AnsiballZ_command.py" <<< 11762 1726853315.44978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.44982: stderr chunk (state=3): >>><<< 11762 1726853315.44984: stdout chunk (state=3): >>><<< 11762 1726853315.44986: done transferring module to remote 11762 1726853315.44988: _low_level_execute_command(): starting 11762 1726853315.44991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/ /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/AnsiballZ_command.py && sleep 0' 11762 1726853315.46364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.46379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.46382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.46454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.48347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.48404: stderr chunk (state=3): >>><<< 11762 1726853315.48412: stdout chunk (state=3): >>><<< 11762 1726853315.48477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853315.48481: _low_level_execute_command(): starting 11762 1726853315.48483: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/AnsiballZ_command.py && sleep 0' 11762 1726853315.49064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853315.49080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.49095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.49118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853315.49135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853315.49148: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853315.49161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.49181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853315.49238: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.49286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 11762 1726853315.49303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.49322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.49446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.65950: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:28:35.650872", "end": "2024-09-20 13:28:35.658464", "delta": "0:00:00.007592", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853315.67681: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853315.67685: stdout chunk (state=3): >>><<< 11762 1726853315.67687: stderr chunk (state=3): >>><<< 11762 1726853315.67690: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:28:35.650872", "end": "2024-09-20 13:28:35.658464", "delta": "0:00:00.007592", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
11762 1726853315.67722: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853315.67807: _low_level_execute_command(): starting 11762 1726853315.67810: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853315.3761308-14825-214601226992516/ > /dev/null 2>&1 && sleep 0' 11762 1726853315.68323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.68342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853315.68378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.68413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.68419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.68421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.68498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.70396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.70430: stderr chunk (state=3): >>><<< 11762 1726853315.70432: stdout chunk (state=3): >>><<< 11762 1726853315.70447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 11762 1726853315.70450: handler run complete 11762 1726853315.70468: Evaluated conditional (False): False 11762 1726853315.70472: Evaluated conditional (False): False 11762 1726853315.70483: attempt loop complete, returning result 11762 1726853315.70485: _execute() done 11762 1726853315.70488: dumping result to json 11762 1726853315.70492: done dumping result, returning 11762 1726853315.70500: done running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' [02083763-bbaf-d845-03d0-000000000e4f] 11762 1726853315.70505: sending task result for task 02083763-bbaf-d845-03d0-000000000e4f 11762 1726853315.70606: done sending task result for task 02083763-bbaf-d845-03d0-000000000e4f 11762 1726853315.70608: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007592", "end": "2024-09-20 13:28:35.658464", "failed_when_result": false, "rc": 1, "start": "2024-09-20 13:28:35.650872" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11762 1726853315.70700: no more pending results, returning what we have 11762 1726853315.70704: results queue empty 11762 1726853315.70705: checking for any_errors_fatal 11762 1726853315.70706: done checking for any_errors_fatal 11762 1726853315.70707: checking for max_fail_percentage 11762 1726853315.70709: done checking for max_fail_percentage 11762 1726853315.70710: checking to see if all hosts have failed and the running result is not ok 11762 1726853315.70711: done checking to see if all hosts have failed 11762 1726853315.70711: getting the remaining hosts for this loop 11762 1726853315.70713: done getting the remaining hosts for this loop 11762 1726853315.70716: getting the next task for host managed_node2 11762 1726853315.70728: done getting next task for host managed_node2 11762 1726853315.70730: ^ task is: TASK: Remove test interfaces 11762 1726853315.70733: ^ state is: HOST STATE: block=5, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853315.70737: getting variables 11762 1726853315.70740: in VariableManager get_vars() 11762 1726853315.70786: Calling all_inventory to load vars for managed_node2 11762 1726853315.70789: Calling groups_inventory to load vars for managed_node2 11762 1726853315.70791: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853315.70801: Calling all_plugins_play to load vars for managed_node2 11762 1726853315.70804: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853315.70806: Calling groups_plugins_play to load vars for managed_node2 11762 1726853315.71748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853315.72600: done with get_vars() 11762 1726853315.72619: done getting variables 11762 1726853315.72666: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] 
************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:28:35 -0400 (0:00:00.391) 0:01:06.157 ****** 11762 1726853315.72691: entering _queue_task() for managed_node2/shell 11762 1726853315.72947: worker is 1 (out of 1 available) 11762 1726853315.72960: exiting _queue_task() for managed_node2/shell 11762 1726853315.72975: done queuing things up, now waiting for results queue to drain 11762 1726853315.72977: waiting for pending results... 11762 1726853315.73158: running TaskExecutor() for managed_node2/TASK: Remove test interfaces 11762 1726853315.73242: in run() - task 02083763-bbaf-d845-03d0-000000000e55 11762 1726853315.73255: variable 'ansible_search_path' from source: unknown 11762 1726853315.73259: variable 'ansible_search_path' from source: unknown 11762 1726853315.73288: calling self._execute() 11762 1726853315.73368: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853315.73374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853315.73382: variable 'omit' from source: magic vars 11762 1726853315.73663: variable 'ansible_distribution_major_version' from source: facts 11762 1726853315.73675: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853315.73681: variable 'omit' from source: magic vars 11762 1726853315.73714: variable 'omit' from source: magic vars 11762 1726853315.73826: variable 'dhcp_interface1' from source: play vars 11762 1726853315.73830: variable 'dhcp_interface2' from source: play vars 11762 1726853315.73849: variable 'omit' from source: magic vars 11762 1726853315.73885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853315.73912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 11762 1726853315.73927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853315.73940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853315.73950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853315.73978: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853315.73982: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853315.73984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853315.74048: Set connection var ansible_timeout to 10 11762 1726853315.74057: Set connection var ansible_shell_type to sh 11762 1726853315.74063: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853315.74068: Set connection var ansible_shell_executable to /bin/sh 11762 1726853315.74081: Set connection var ansible_pipelining to False 11762 1726853315.74083: Set connection var ansible_connection to ssh 11762 1726853315.74100: variable 'ansible_shell_executable' from source: unknown 11762 1726853315.74103: variable 'ansible_connection' from source: unknown 11762 1726853315.74106: variable 'ansible_module_compression' from source: unknown 11762 1726853315.74108: variable 'ansible_shell_type' from source: unknown 11762 1726853315.74110: variable 'ansible_shell_executable' from source: unknown 11762 1726853315.74112: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853315.74116: variable 'ansible_pipelining' from source: unknown 11762 1726853315.74119: variable 'ansible_timeout' from source: unknown 11762 1726853315.74122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853315.74227: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853315.74237: variable 'omit' from source: magic vars 11762 1726853315.74242: starting attempt loop 11762 1726853315.74247: running the handler 11762 1726853315.74256: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853315.74273: _low_level_execute_command(): starting 11762 1726853315.74280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853315.74801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.74805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.74808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.74810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.74856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.74862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.74880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.74951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.76706: stdout chunk (state=3): >>>/root <<< 11762 1726853315.76810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.76837: stderr chunk (state=3): >>><<< 11762 1726853315.76841: stdout chunk (state=3): >>><<< 11762 1726853315.76863: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 11762 1726853315.76876: _low_level_execute_command(): starting 11762 1726853315.76883: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857 `" && echo ansible-tmp-1726853315.7686234-14849-208194389532857="` echo /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857 `" ) && sleep 0' 11762 1726853315.77322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.77333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853315.77336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11762 1726853315.77338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.77340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.77377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.77395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.77470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11762 1726853315.79518: stdout chunk (state=3): >>>ansible-tmp-1726853315.7686234-14849-208194389532857=/root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857 <<< 11762 1726853315.79627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.79654: stderr chunk (state=3): >>><<< 11762 1726853315.79658: stdout chunk (state=3): >>><<< 11762 1726853315.79672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853315.7686234-14849-208194389532857=/root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853315.79699: variable 'ansible_module_compression' from source: unknown 11762 1726853315.79746: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853315.79774: variable 'ansible_facts' from source: unknown 11762 1726853315.79830: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/AnsiballZ_command.py 11762 1726853315.79929: Sending initial data 11762 1726853315.79933: Sent initial data (156 bytes) 11762 1726853315.80383: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853315.80386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.80390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.80392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853315.80394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.80440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.80443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.80522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 
1726853315.82208: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11762 1726853315.82214: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853315.82275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853315.82347: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpj4rz9rt1 /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/AnsiballZ_command.py <<< 11762 1726853315.82350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/AnsiballZ_command.py" <<< 11762 1726853315.82415: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpj4rz9rt1" to remote "/root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/AnsiballZ_command.py" <<< 11762 1726853315.82418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/AnsiballZ_command.py" <<< 11762 1726853315.83062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.83104: stderr chunk (state=3): >>><<< 11762 
1726853315.83108: stdout chunk (state=3): >>><<< 11762 1726853315.83132: done transferring module to remote 11762 1726853315.83141: _low_level_execute_command(): starting 11762 1726853315.83148: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/ /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/AnsiballZ_command.py && sleep 0' 11762 1726853315.83586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.83589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853315.83591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.83597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.83600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853315.83601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.83649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853315.83655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.83722: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853315.85613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853315.85635: stderr chunk (state=3): >>><<< 11762 1726853315.85638: stdout chunk (state=3): >>><<< 11762 1726853315.85652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853315.85655: _low_level_execute_command(): starting 11762 1726853315.85660: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/AnsiballZ_command.py && sleep 0' 11762 1726853315.86082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 
1726853315.86085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.86087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853315.86089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853315.86091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853315.86145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853315.86148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853315.86223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.06676: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete 
link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:28:36.021619", "end": "2024-09-20 13:28:36.065264", "delta": "0:00:00.043645", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853316.08312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853316.08338: stderr chunk (state=3): >>><<< 11762 1726853316.08342: stdout chunk (state=3): >>><<< 11762 1726853316.08364: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:28:36.021619", "end": "2024-09-20 13:28:36.065264", "delta": "0:00:00.043645", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || 
rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
11762 1726853316.08400: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853316.08407: _low_level_execute_command(): starting 11762 1726853316.08412: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853315.7686234-14849-208194389532857/ > /dev/null 2>&1 && sleep 0' 11762 1726853316.08848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.08852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.08882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853316.08885: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 11762 1726853316.08887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.08889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.08950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.08954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.08956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.09026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.10941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.10977: stderr chunk (state=3): >>><<< 11762 1726853316.10980: stdout chunk (state=3): >>><<< 11762 1726853316.10989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.10995: handler run complete 11762 1726853316.11011: Evaluated conditional (False): False 11762 1726853316.11019: attempt loop complete, returning result 11762 1726853316.11023: _execute() done 11762 1726853316.11025: dumping result to json 11762 1726853316.11030: done dumping result, returning 11762 1726853316.11037: done running TaskExecutor() for managed_node2/TASK: Remove test interfaces [02083763-bbaf-d845-03d0-000000000e55] 11762 1726853316.11042: sending task result for task 02083763-bbaf-d845-03d0-000000000e55 11762 1726853316.11137: done sending task result for task 02083763-bbaf-d845-03d0-000000000e55 11762 1726853316.11140: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.043645", "end": "2024-09-20 13:28:36.065264", "rc": 0, "start": "2024-09-20 13:28:36.021619" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 
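The "Remove test interfaces" result above embeds an error-collecting cleanup script: each `ip link delete` is allowed to fail, a non-zero rc is remembered, an ERROR line is echoed, and the script keeps going instead of aborting on the first failure. A minimal sketch of that idiom follows; it substitutes harmless placeholder commands (`true`/`false`) for the real `ip link delete` calls, since those require root and existing interfaces, and the `try` helper is hypothetical, not part of the original script:

```shell
#!/bin/sh
# Sketch of the error-collecting cleanup pattern from the task's shell script:
# attempt every step, report each failure, remember the last non-zero rc.
set -u
rc=0
try() {
    step_rc=0
    "$@" || step_rc="$?"
    if [ "$step_rc" != 0 ]; then
        echo "ERROR - could not run '$*' - error $step_rc"
        rc="$step_rc"
    fi
}
try true    # succeeds silently
try false   # fails: prints an ERROR line, rc becomes 1
try true    # still runs despite the earlier failure
echo "final rc=$rc"
```

Unlike `set -e` alone, this keeps later cleanup steps running after an early failure, which matters when test interfaces must be removed even if one of them is already gone.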
11762 1726853316.11204: no more pending results, returning what we have 11762 1726853316.11207: results queue empty 11762 1726853316.11208: checking for any_errors_fatal 11762 1726853316.11220: done checking for any_errors_fatal 11762 1726853316.11220: checking for max_fail_percentage 11762 1726853316.11223: done checking for max_fail_percentage 11762 1726853316.11223: checking to see if all hosts have failed and the running result is not ok 11762 1726853316.11224: done checking to see if all hosts have failed 11762 1726853316.11225: getting the remaining hosts for this loop 11762 1726853316.11226: done getting the remaining hosts for this loop 11762 1726853316.11229: getting the next task for host managed_node2 11762 1726853316.11237: done getting next task for host managed_node2 11762 1726853316.11240: ^ task is: TASK: Stop dnsmasq/radvd services 11762 1726853316.11244: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853316.11251: getting variables 11762 1726853316.11252: in VariableManager get_vars() 11762 1726853316.11306: Calling all_inventory to load vars for managed_node2 11762 1726853316.11309: Calling groups_inventory to load vars for managed_node2 11762 1726853316.11310: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853316.11325: Calling all_plugins_play to load vars for managed_node2 11762 1726853316.11328: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853316.11330: Calling groups_plugins_play to load vars for managed_node2 11762 1726853316.12643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853316.13819: done with get_vars() 11762 1726853316.13838: done getting variables 11762 1726853316.13887: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 13:28:36 -0400 (0:00:00.412) 0:01:06.569 ****** 11762 1726853316.13910: entering _queue_task() for managed_node2/shell 11762 1726853316.14151: worker is 1 (out of 1 available) 11762 1726853316.14165: exiting _queue_task() for managed_node2/shell 11762 1726853316.14181: done queuing things up, now waiting for results queue to drain 11762 1726853316.14183: waiting for pending results... 
11762 1726853316.14364: running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services 11762 1726853316.14451: in run() - task 02083763-bbaf-d845-03d0-000000000e56 11762 1726853316.14461: variable 'ansible_search_path' from source: unknown 11762 1726853316.14464: variable 'ansible_search_path' from source: unknown 11762 1726853316.14493: calling self._execute() 11762 1726853316.14573: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853316.14577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853316.14586: variable 'omit' from source: magic vars 11762 1726853316.14856: variable 'ansible_distribution_major_version' from source: facts 11762 1726853316.14866: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853316.14874: variable 'omit' from source: magic vars 11762 1726853316.14909: variable 'omit' from source: magic vars 11762 1726853316.14933: variable 'omit' from source: magic vars 11762 1726853316.14966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853316.14995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853316.15010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853316.15023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853316.15033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853316.15066: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853316.15069: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853316.15072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 
1726853316.15136: Set connection var ansible_timeout to 10 11762 1726853316.15139: Set connection var ansible_shell_type to sh 11762 1726853316.15142: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853316.15149: Set connection var ansible_shell_executable to /bin/sh 11762 1726853316.15155: Set connection var ansible_pipelining to False 11762 1726853316.15167: Set connection var ansible_connection to ssh 11762 1726853316.15204: variable 'ansible_shell_executable' from source: unknown 11762 1726853316.15208: variable 'ansible_connection' from source: unknown 11762 1726853316.15211: variable 'ansible_module_compression' from source: unknown 11762 1726853316.15213: variable 'ansible_shell_type' from source: unknown 11762 1726853316.15215: variable 'ansible_shell_executable' from source: unknown 11762 1726853316.15217: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853316.15219: variable 'ansible_pipelining' from source: unknown 11762 1726853316.15221: variable 'ansible_timeout' from source: unknown 11762 1726853316.15223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853316.15374: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853316.15378: variable 'omit' from source: magic vars 11762 1726853316.15380: starting attempt loop 11762 1726853316.15385: running the handler 11762 1726853316.15396: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853316.15415: 
_low_level_execute_command(): starting 11762 1726853316.15424: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853316.16109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.16121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.16142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.16149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.16176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853316.16179: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853316.16182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.16184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853316.16251: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853316.16254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853316.16256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.16257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.16259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.16261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853316.16263: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853316.16264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.16317: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.16328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.16370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.16438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.18178: stdout chunk (state=3): >>>/root <<< 11762 1726853316.18267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.18476: stderr chunk (state=3): >>><<< 11762 1726853316.18480: stdout chunk (state=3): >>><<< 11762 1726853316.18484: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.18496: _low_level_execute_command(): starting 11762 1726853316.18499: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739 `" && echo ansible-tmp-1726853316.18326-14858-115576038551739="` echo /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739 `" ) && sleep 0' 11762 1726853316.18913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.18922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.18934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.18977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.18980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853316.18983: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853316.18985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.18988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853316.19054: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853316.19057: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853316.19059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.19061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.19063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.19065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853316.19067: stderr chunk (state=3): >>>debug2: match found <<< 11762 1726853316.19068: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.19107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.19118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.19127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.19228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.21266: stdout chunk (state=3): >>>ansible-tmp-1726853316.18326-14858-115576038551739=/root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739 <<< 11762 1726853316.21424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.21428: stdout chunk (state=3): >>><<< 11762 1726853316.21430: stderr chunk (state=3): >>><<< 11762 1726853316.21450: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853316.18326-14858-115576038551739=/root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.21581: variable 'ansible_module_compression' from source: unknown 11762 1726853316.21584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853316.21595: variable 'ansible_facts' from source: unknown 11762 1726853316.21670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/AnsiballZ_command.py 11762 1726853316.21812: Sending initial data 11762 1726853316.21913: Sent initial data (154 bytes) 11762 1726853316.22583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.22587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.22615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.22639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.22738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.24465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853316.24564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853316.24675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpgr6yarlk /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/AnsiballZ_command.py <<< 11762 1726853316.24679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/AnsiballZ_command.py" <<< 11762 1726853316.24781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpgr6yarlk" to remote "/root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/AnsiballZ_command.py" <<< 11762 1726853316.25786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.25828: stderr chunk (state=3): >>><<< 11762 1726853316.25853: stdout chunk (state=3): >>><<< 11762 1726853316.25881: done transferring module to remote 11762 1726853316.25898: _low_level_execute_command(): starting 11762 1726853316.25922: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/ /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/AnsiballZ_command.py && sleep 0' 11762 1726853316.26330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.26334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.26355: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.26402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.26407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.26487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.28412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.28428: stderr chunk (state=3): >>><<< 11762 1726853316.28431: stdout chunk (state=3): >>><<< 11762 1726853316.28444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.28448: _low_level_execute_command(): starting 11762 1726853316.28455: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/AnsiballZ_command.py && sleep 0' 11762 1726853316.28870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.28875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 11762 1726853316.28878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.28880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.28882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 11762 1726853316.28885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.28927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.28931: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.29012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.47489: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:28:36.446716", "end": "2024-09-20 13:28:36.472154", "delta": "0:00:00.025438", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853316.49468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 
closed. <<< 11762 1726853316.49475: stdout chunk (state=3): >>><<< 11762 1726853316.49478: stderr chunk (state=3): >>><<< 11762 1726853316.49481: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:28:36.446716", "end": "2024-09-20 13:28:36.472154", "delta": "0:00:00.025438", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
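The module reply captured above arrives as a single JSON object on the remote Python process's stdout, which the controller then folds into the task result. A minimal sketch (not Ansible's internals; the trimmed-down field subset is an assumption for illustration) of pulling out the values that later appear in the `ok: [managed_node2]` summary:

```python
import json

# A shortened stand-in for the module reply seen in the log above.
raw = (
    '{"changed": true, "stdout": "", "stderr": "inactive", "rc": 0, '
    '"delta": "0:00:00.025438", "start": "2024-09-20 13:28:36.446716", '
    '"end": "2024-09-20 13:28:36.472154"}'
)

result = json.loads(raw)

# The callback summary keys shown in the task result block.
summary = {k: result[k] for k in ("rc", "delta", "start", "end")}
print(summary["rc"])  # 0
```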
11762 1726853316.49489: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853316.49491: _low_level_execute_command(): starting 11762 1726853316.49494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853316.18326-14858-115576038551739/ > /dev/null 2>&1 && sleep 0' 11762 1726853316.50473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.50599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.50640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.50710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.52631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.52687: stderr chunk (state=3): >>><<< 11762 1726853316.52697: stdout chunk (state=3): >>><<< 11762 1726853316.52718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.52730: handler run complete 11762 1726853316.52761: Evaluated conditional (False): False 11762 1726853316.52791: attempt loop complete, returning result 11762 1726853316.52798: _execute() done 11762 1726853316.52805: dumping result to json 11762 1726853316.52813: done dumping result, returning 11762 1726853316.52825: done running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services [02083763-bbaf-d845-03d0-000000000e56] 11762 1726853316.52836: sending task result for task 02083763-bbaf-d845-03d0-000000000e56 ok: [managed_node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.025438", "end": "2024-09-20 13:28:36.472154", "rc": 0, "start": "2024-09-20 13:28:36.446716" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11762 1726853316.53153: no more pending results, returning what we have 11762 1726853316.53157: results queue empty 11762 1726853316.53158: checking for any_errors_fatal 11762 1726853316.53168: done checking for any_errors_fatal 11762 1726853316.53169: checking 
for max_fail_percentage 11762 1726853316.53173: done checking for max_fail_percentage 11762 1726853316.53175: checking to see if all hosts have failed and the running result is not ok 11762 1726853316.53175: done checking to see if all hosts have failed 11762 1726853316.53176: getting the remaining hosts for this loop 11762 1726853316.53178: done getting the remaining hosts for this loop 11762 1726853316.53275: getting the next task for host managed_node2 11762 1726853316.53288: done getting next task for host managed_node2 11762 1726853316.53296: ^ task is: TASK: Check routes and DNS 11762 1726853316.53301: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11762 1726853316.53305: getting variables 11762 1726853316.53306: in VariableManager get_vars() 11762 1726853316.53361: Calling all_inventory to load vars for managed_node2 11762 1726853316.53364: Calling groups_inventory to load vars for managed_node2 11762 1726853316.53367: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853316.53509: Calling all_plugins_play to load vars for managed_node2 11762 1726853316.53515: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853316.53518: Calling groups_plugins_play to load vars for managed_node2 11762 1726853316.54184: done sending task result for task 02083763-bbaf-d845-03d0-000000000e56 11762 1726853316.54188: WORKER PROCESS EXITING 11762 1726853316.55239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853316.56996: done with get_vars() 11762 1726853316.57028: done getting variables 11762 1726853316.57109: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:28:36 -0400 (0:00:00.432) 0:01:07.001 ****** 11762 1726853316.57155: entering _queue_task() for managed_node2/shell 11762 1726853316.57536: worker is 1 (out of 1 available) 11762 1726853316.57551: exiting _queue_task() for managed_node2/shell 11762 1726853316.57565: done queuing things up, now waiting for results queue to drain 11762 1726853316.57566: waiting for pending results... 
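The `( umask 77 && mkdir -p ... )` wrapper that every `_low_level_execute_command()` tmp-dir step above uses makes the per-task directory owner-only before anything is written into it. A minimal local sketch of the same pattern (using a `mktemp -d` path rather than the remote `/root/.ansible/tmp`, and GNU `stat`, both assumptions for illustration):

```shell
# Same remote-side pattern as the log's mkdir step, run against a local scratch dir.
umask 77                                   # strip group/other bits from new files/dirs
base="$(mktemp -d)"                        # stand-in for ~/.ansible/tmp
taskdir="$base/ansible-tmp-example"        # stand-in for ansible-tmp-<ts>-<pid>-<rand>
mkdir "$taskdir"                           # created with mode 700 under umask 77
stat -c '%a' "$taskdir"                    # prints 700
rm -rf "$base"                             # mirrors the final 'rm -f -r .../' cleanup
```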
11762 1726853316.57884: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 11762 1726853316.58008: in run() - task 02083763-bbaf-d845-03d0-000000000e5a 11762 1726853316.58029: variable 'ansible_search_path' from source: unknown 11762 1726853316.58037: variable 'ansible_search_path' from source: unknown 11762 1726853316.58082: calling self._execute() 11762 1726853316.58182: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853316.58197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853316.58213: variable 'omit' from source: magic vars 11762 1726853316.58583: variable 'ansible_distribution_major_version' from source: facts 11762 1726853316.58600: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853316.58611: variable 'omit' from source: magic vars 11762 1726853316.58657: variable 'omit' from source: magic vars 11762 1726853316.58696: variable 'omit' from source: magic vars 11762 1726853316.58739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853316.58780: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853316.58808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853316.58829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853316.58844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853316.58879: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853316.58895: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853316.58897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853316.59004: 
Set connection var ansible_timeout to 10 11762 1726853316.59007: Set connection var ansible_shell_type to sh 11762 1726853316.59176: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853316.59178: Set connection var ansible_shell_executable to /bin/sh 11762 1726853316.59181: Set connection var ansible_pipelining to False 11762 1726853316.59182: Set connection var ansible_connection to ssh 11762 1726853316.59184: variable 'ansible_shell_executable' from source: unknown 11762 1726853316.59186: variable 'ansible_connection' from source: unknown 11762 1726853316.59188: variable 'ansible_module_compression' from source: unknown 11762 1726853316.59189: variable 'ansible_shell_type' from source: unknown 11762 1726853316.59191: variable 'ansible_shell_executable' from source: unknown 11762 1726853316.59193: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853316.59194: variable 'ansible_pipelining' from source: unknown 11762 1726853316.59196: variable 'ansible_timeout' from source: unknown 11762 1726853316.59198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853316.59232: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853316.59250: variable 'omit' from source: magic vars 11762 1726853316.59258: starting attempt loop 11762 1726853316.59264: running the handler 11762 1726853316.59281: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853316.59304: 
_low_level_execute_command(): starting 11762 1726853316.59322: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853316.60031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.60049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.60092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.60109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853316.60199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.60248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.60329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.62075: stdout chunk (state=3): >>>/root <<< 11762 1726853316.62208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.62227: stdout chunk (state=3): >>><<< 11762 1726853316.62240: stderr chunk (state=3): >>><<< 11762 1726853316.62272: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.62293: _low_level_execute_command(): starting 11762 1726853316.62304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069 `" && echo ansible-tmp-1726853316.6228018-14883-139584893749069="` echo /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069 `" ) && sleep 0' 11762 1726853316.63083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.63108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.63166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.63200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.63217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.63324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.65339: stdout chunk (state=3): >>>ansible-tmp-1726853316.6228018-14883-139584893749069=/root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069 <<< 11762 1726853316.65502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.65505: stdout chunk (state=3): >>><<< 11762 1726853316.65508: stderr chunk (state=3): >>><<< 11762 1726853316.65677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853316.6228018-14883-139584893749069=/root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.65680: variable 'ansible_module_compression' from source: unknown 11762 1726853316.65682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853316.65684: variable 'ansible_facts' from source: unknown 11762 1726853316.65784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/AnsiballZ_command.py 11762 1726853316.66042: Sending initial data 11762 1726853316.66047: Sent initial data (156 bytes) 11762 1726853316.66633: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.66650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.66666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.66786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.66805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.66909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.68613: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11762 1726853316.68631: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853316.68724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11762 1726853316.68805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpr8uvlj6d /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/AnsiballZ_command.py <<< 11762 1726853316.68808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/AnsiballZ_command.py" <<< 11762 1726853316.68893: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpr8uvlj6d" to remote "/root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/AnsiballZ_command.py" <<< 11762 1726853316.70016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.70019: stdout chunk (state=3): >>><<< 11762 1726853316.70022: stderr chunk (state=3): >>><<< 11762 1726853316.70024: done transferring module to remote 11762 1726853316.70035: _low_level_execute_command(): starting 11762 1726853316.70044: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/ /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/AnsiballZ_command.py && sleep 0' 11762 1726853316.70790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.70794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.70826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.70843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.70867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.70977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.72888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.72941: stderr chunk (state=3): >>><<< 11762 1726853316.72944: stdout chunk (state=3): >>><<< 11762 1726853316.72964: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.72967: _low_level_execute_command(): starting 11762 1726853316.72975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/AnsiballZ_command.py && sleep 0' 11762 1726853316.73605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.73609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.73677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.73680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.73683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853316.73685: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853316.73687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.73689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11762 1726853316.73692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 11762 1726853316.73694: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11762 1726853316.73703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.73713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.73734: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.73784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853316.73812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853316.73827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.73873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.74074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.90573: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:bc:da:29:a4:45 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.197/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3428sec preferred_lft 3428sec\n inet6 fe80::10bc:daff:fe29:a445/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.197 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.197 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat 
/etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:28:36.895642", "end": "2024-09-20 13:28:36.904594", "delta": "0:00:00.008952", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 11762 1726853316.90596: stdout chunk (state=3): >>> <<< 11762 1726853316.92457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 11762 1726853316.92461: stdout chunk (state=3): >>><<< 11762 1726853316.92463: stderr chunk (state=3): >>><<< 11762 1726853316.92466: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:bc:da:29:a4:45 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.197/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3428sec preferred_lft 3428sec\n inet6 fe80::10bc:daff:fe29:a445/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.197 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.197 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by 
NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:28:36.895642", "end": "2024-09-20 13:28:36.904594", "delta": "0:00:00.008952", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853316.92476: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853316.92478: _low_level_execute_command(): starting 11762 1726853316.92480: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853316.6228018-14883-139584893749069/ > /dev/null 2>&1 && sleep 0' 11762 1726853316.93055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853316.93070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853316.93091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853316.93110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853316.93126: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 11762 1726853316.93235: stderr chunk (state=3): >>>debug2: match not found <<< 11762 1726853316.93239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853316.93287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853316.93358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853316.95315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853316.95318: stdout chunk (state=3): >>><<< 11762 1726853316.95320: stderr chunk (state=3): >>><<< 11762 1726853316.95339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853316.95350: handler run complete 11762 1726853316.95380: Evaluated conditional (False): False 11762 1726853316.95475: attempt loop complete, returning result 11762 1726853316.95478: _execute() done 11762 1726853316.95481: dumping result to json 11762 1726853316.95483: done dumping result, returning 11762 1726853316.95485: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [02083763-bbaf-d845-03d0-000000000e5a] 11762 1726853316.95487: sending task result for task 02083763-bbaf-d845-03d0-000000000e5a 11762 1726853316.95561: done sending task result for task 02083763-bbaf-d845-03d0-000000000e5a 11762 1726853316.95563: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008952", "end": "2024-09-20 13:28:36.904594", "rc": 0, "start": "2024-09-20 13:28:36.895642" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 
12:bc:da:29:a4:45 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.197/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3428sec preferred_lft 3428sec inet6 fe80::10bc:daff:fe29:a445/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.197 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.197 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11762 1726853316.95636: no more pending results, returning what we have 11762 1726853316.95640: results queue empty 11762 1726853316.95641: checking for any_errors_fatal 11762 1726853316.95654: done checking for any_errors_fatal 11762 1726853316.95655: checking for max_fail_percentage 11762 1726853316.95658: done checking for max_fail_percentage 11762 1726853316.95659: checking to see if all hosts have failed and the running result is not ok 11762 1726853316.95659: done checking to see if all hosts have failed 11762 1726853316.95660: getting the remaining hosts for this loop 11762 1726853316.95662: done getting the remaining hosts for this loop 11762 1726853316.95666: getting the next task for host managed_node2 11762 1726853316.95675: done getting next task for host managed_node2 11762 1726853316.95677: ^ task is: TASK: Verify DNS and network connectivity 11762 1726853316.95681: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11762 1726853316.95691: getting variables 11762 1726853316.95693: in VariableManager get_vars() 11762 1726853316.95742: Calling all_inventory to load vars for managed_node2 11762 1726853316.95745: Calling groups_inventory to load vars for managed_node2 11762 1726853316.95747: Calling all_plugins_inventory to load vars for managed_node2 11762 1726853316.95758: Calling all_plugins_play to load vars for managed_node2 11762 1726853316.95761: Calling groups_plugins_inventory to load vars for managed_node2 11762 1726853316.95764: Calling groups_plugins_play to load vars for managed_node2 11762 1726853316.97491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11762 1726853316.99042: done with get_vars() 11762 1726853316.99072: done getting variables 11762 1726853316.99127: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:28:36 -0400 (0:00:00.420) 0:01:07.422 ****** 11762 1726853316.99161: entering _queue_task() for managed_node2/shell 11762 1726853316.99612: worker is 1 (out of 1 available) 11762 
1726853316.99622: exiting _queue_task() for managed_node2/shell 11762 1726853316.99633: done queuing things up, now waiting for results queue to drain 11762 1726853316.99635: waiting for pending results... 11762 1726853317.00037: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 11762 1726853317.00041: in run() - task 02083763-bbaf-d845-03d0-000000000e5b 11762 1726853317.00044: variable 'ansible_search_path' from source: unknown 11762 1726853317.00046: variable 'ansible_search_path' from source: unknown 11762 1726853317.00049: calling self._execute() 11762 1726853317.00144: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853317.00156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853317.00173: variable 'omit' from source: magic vars 11762 1726853317.00565: variable 'ansible_distribution_major_version' from source: facts 11762 1726853317.00586: Evaluated conditional (ansible_distribution_major_version != '6'): True 11762 1726853317.00735: variable 'ansible_facts' from source: unknown 11762 1726853317.01696: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11762 1726853317.01762: variable 'omit' from source: magic vars 11762 1726853317.01774: variable 'omit' from source: magic vars 11762 1726853317.01811: variable 'omit' from source: magic vars 11762 1726853317.01861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11762 1726853317.01910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11762 1726853317.01932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11762 1726853317.01954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853317.01979: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11762 1726853317.02076: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11762 1726853317.02081: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853317.02084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853317.02132: Set connection var ansible_timeout to 10 11762 1726853317.02141: Set connection var ansible_shell_type to sh 11762 1726853317.02153: Set connection var ansible_module_compression to ZIP_DEFLATED 11762 1726853317.02162: Set connection var ansible_shell_executable to /bin/sh 11762 1726853317.02176: Set connection var ansible_pipelining to False 11762 1726853317.02187: Set connection var ansible_connection to ssh 11762 1726853317.02276: variable 'ansible_shell_executable' from source: unknown 11762 1726853317.02279: variable 'ansible_connection' from source: unknown 11762 1726853317.02281: variable 'ansible_module_compression' from source: unknown 11762 1726853317.02283: variable 'ansible_shell_type' from source: unknown 11762 1726853317.02285: variable 'ansible_shell_executable' from source: unknown 11762 1726853317.02286: variable 'ansible_host' from source: host vars for 'managed_node2' 11762 1726853317.02288: variable 'ansible_pipelining' from source: unknown 11762 1726853317.02289: variable 'ansible_timeout' from source: unknown 11762 1726853317.02291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11762 1726853317.02396: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853317.02416: variable 'omit' from source: magic vars 11762 1726853317.02429: starting attempt 
loop 11762 1726853317.02434: running the handler 11762 1726853317.02447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11762 1726853317.02468: _low_level_execute_command(): starting 11762 1726853317.02481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11762 1726853317.03364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853317.03473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853317.03509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853317.03593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853317.05315: stdout chunk (state=3): >>>/root <<< 11762 1726853317.05481: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11762 1726853317.05484: stdout chunk (state=3): >>><<< 11762 1726853317.05488: stderr chunk (state=3): >>><<< 11762 1726853317.05511: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853317.05530: _low_level_execute_command(): starting 11762 1726853317.05616: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895 `" && echo ansible-tmp-1726853317.0551798-14902-194415732110895="` echo /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895 `" ) && sleep 0' 11762 1726853317.06165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853317.06182: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853317.06197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853317.06234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853317.06287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853317.06341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853317.06367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853317.06403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853317.06537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853317.08578: stdout chunk (state=3): >>>ansible-tmp-1726853317.0551798-14902-194415732110895=/root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895 <<< 11762 1726853317.08738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853317.08742: stdout chunk (state=3): >>><<< 11762 1726853317.08744: stderr chunk (state=3): >>><<< 11762 1726853317.08763: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853317.0551798-14902-194415732110895=/root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853317.08976: variable 'ansible_module_compression' from source: unknown 11762 1726853317.08979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11762dxnuypj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11762 1726853317.08981: variable 'ansible_facts' from source: unknown 11762 1726853317.08992: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/AnsiballZ_command.py 11762 1726853317.09228: Sending initial data 11762 1726853317.09231: Sent initial data (156 bytes) 11762 1726853317.09751: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853317.09762: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853317.09776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853317.09877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853317.09919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853317.09994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853317.11695: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11762 1726853317.11788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11762 1726853317.11891: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpwm9ow_8g /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/AnsiballZ_command.py <<< 11762 1726853317.11901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/AnsiballZ_command.py" <<< 11762 1726853317.11958: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11762dxnuypj1/tmpwm9ow_8g" to remote "/root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/AnsiballZ_command.py" <<< 11762 1726853317.12842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853317.12861: stdout chunk (state=3): >>><<< 11762 1726853317.12875: stderr chunk (state=3): >>><<< 11762 1726853317.12911: done transferring module to remote 11762 1726853317.12927: _low_level_execute_command(): starting 11762 1726853317.13013: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/ /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/AnsiballZ_command.py && sleep 0' 11762 1726853317.13588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853317.13603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853317.13624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853317.13645: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853317.13693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11762 1726853317.13774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853317.13801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853317.13840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853317.13912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853317.15830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853317.15833: stdout chunk (state=3): >>><<< 11762 1726853317.15835: stderr chunk (state=3): >>><<< 11762 1726853317.15930: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853317.15935: _low_level_execute_command(): starting 11762 1726853317.15938: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/AnsiballZ_command.py && sleep 0' 11762 1726853317.16594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853317.16597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 11762 1726853317.16619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853317.16636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853317.16746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853317.44397: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13125 0 --:--:-- --:--:-- --:--:-- 13260\n % Total % Received % Xferd Average 
Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3906 0 --:--:-- --:--:-- --:--:-- 3932", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:28:37.321662", "end": "2024-09-20 13:28:37.442701", "delta": "0:00:00.121039", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11762 1726853317.46121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 11762 1726853317.46131: stdout chunk (state=3): >>><<< 11762 1726853317.46252: stderr chunk (state=3): >>><<< 11762 1726853317.46257: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13125 0 --:--:-- --:--:-- --:--:-- 13260\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3906 0 --:--:-- --:--:-- --:--:-- 3932", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:28:37.321662", "end": "2024-09-20 13:28:37.442701", "delta": "0:00:00.121039", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 11762 1726853317.46267: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11762 1726853317.46288: _low_level_execute_command(): starting 11762 1726853317.46298: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853317.0551798-14902-194415732110895/ > /dev/null 2>&1 && sleep 0' 11762 1726853317.46965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11762 1726853317.46983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11762 1726853317.46997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11762 1726853317.47020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11762 1726853317.47130: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 11762 1726853317.47164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11762 1726853317.47275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11762 1726853317.49165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11762 1726853317.49221: stderr chunk (state=3): >>><<< 11762 1726853317.49235: stdout chunk (state=3): >>><<< 11762 1726853317.49268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11762 1726853317.49286: handler run complete 11762 1726853317.49314: Evaluated conditional (False): False 11762 1726853317.49331: attempt loop complete, returning result 11762 1726853317.49338: _execute() done 11762 1726853317.49348: dumping result to json 11762 1726853317.49476: done dumping result, returning 11762 1726853317.49480: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [02083763-bbaf-d845-03d0-000000000e5b] 11762 1726853317.49482: sending task result for task 02083763-bbaf-d845-03d0-000000000e5b ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.121039", "end": "2024-09-20 13:28:37.442701", "rc": 0, "start": "2024-09-20 13:28:37.321662" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 13125 0 --:--:-- --:--:-- --:--:-- 13260 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 3906 0 --:--:-- --:--:-- --:--:-- 3932 11762 1726853317.49636: no more pending results, returning what we have 11762 1726853317.49640: results queue empty 11762 1726853317.49641: 
checking for any_errors_fatal 11762 1726853317.49652: done checking for any_errors_fatal 11762 1726853317.49652: checking for max_fail_percentage 11762 1726853317.49655: done checking for max_fail_percentage 11762 1726853317.49656: checking to see if all hosts have failed and the running result is not ok 11762 1726853317.49656: done checking to see if all hosts have failed 11762 1726853317.49657: getting the remaining hosts for this loop 11762 1726853317.49659: done getting the remaining hosts for this loop 11762 1726853317.49662: getting the next task for host managed_node2 11762 1726853317.49676: done getting next task for host managed_node2 11762 1726853317.49678: ^ task is: TASK: meta (flush_handlers) 11762 1726853317.49684: done sending task result for task 02083763-bbaf-d845-03d0-000000000e5b 11762 1726853317.49782: WORKER PROCESS EXITING 11762 1726853317.49777: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
11762 1726853317.49790: getting variables
11762 1726853317.49791: in VariableManager get_vars()
11762 1726853317.49842: Calling all_inventory to load vars for managed_node2
11762 1726853317.49848: Calling groups_inventory to load vars for managed_node2
11762 1726853317.49851: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853317.49862: Calling all_plugins_play to load vars for managed_node2
11762 1726853317.49865: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853317.49868: Calling groups_plugins_play to load vars for managed_node2
11762 1726853317.60492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853317.61983: done with get_vars()
11762 1726853317.62009: done getting variables
11762 1726853317.62076: in VariableManager get_vars()
11762 1726853317.62096: Calling all_inventory to load vars for managed_node2
11762 1726853317.62099: Calling groups_inventory to load vars for managed_node2
11762 1726853317.62101: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853317.62106: Calling all_plugins_play to load vars for managed_node2
11762 1726853317.62109: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853317.62112: Calling groups_plugins_play to load vars for managed_node2
11762 1726853317.63230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853317.64889: done with get_vars()
11762 1726853317.64914: done queuing things up, now waiting for results queue to drain
11762 1726853317.64916: results queue empty
11762 1726853317.64917: checking for any_errors_fatal
11762 1726853317.64921: done checking for any_errors_fatal
11762 1726853317.64922: checking for max_fail_percentage
11762 1726853317.64923: done checking for max_fail_percentage
11762 1726853317.64924: checking to see if all hosts have failed and the running result is not ok
11762 1726853317.64925: done checking to see if all hosts have failed
11762 1726853317.64925: getting the remaining hosts for this loop
11762 1726853317.64926: done getting the remaining hosts for this loop
11762 1726853317.64929: getting the next task for host managed_node2
11762 1726853317.64933: done getting next task for host managed_node2
11762 1726853317.64934: ^ task is: TASK: meta (flush_handlers)
11762 1726853317.64936: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853317.64939: getting variables
11762 1726853317.64940: in VariableManager get_vars()
11762 1726853317.64958: Calling all_inventory to load vars for managed_node2
11762 1726853317.64960: Calling groups_inventory to load vars for managed_node2
11762 1726853317.64962: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853317.64968: Calling all_plugins_play to load vars for managed_node2
11762 1726853317.64970: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853317.64975: Calling groups_plugins_play to load vars for managed_node2
11762 1726853317.66076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853317.67563: done with get_vars()
11762 1726853317.67587: done getting variables
11762 1726853317.67640: in VariableManager get_vars()
11762 1726853317.67661: Calling all_inventory to load vars for managed_node2
11762 1726853317.67664: Calling groups_inventory to load vars for managed_node2
11762 1726853317.67666: Calling all_plugins_inventory to load vars for managed_node2
11762 1726853317.67673: Calling all_plugins_play to load vars for managed_node2
11762 1726853317.67676: Calling groups_plugins_inventory to load vars for managed_node2
11762 1726853317.67679: Calling groups_plugins_play to load vars for managed_node2
11762 1726853317.68839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11762 1726853317.70411: done with get_vars()
11762 1726853317.70436: done queuing things up, now waiting for results queue to drain
11762 1726853317.70438: results queue empty
11762 1726853317.70439: checking for any_errors_fatal
11762 1726853317.70440: done checking for any_errors_fatal
11762 1726853317.70441: checking for max_fail_percentage
11762 1726853317.70442: done checking for max_fail_percentage
11762 1726853317.70443: checking to see if all hosts have failed and the running result is not ok
11762 1726853317.70443: done checking to see if all hosts have failed
11762 1726853317.70444: getting the remaining hosts for this loop
11762 1726853317.70447: done getting the remaining hosts for this loop
11762 1726853317.70450: getting the next task for host managed_node2
11762 1726853317.70453: done getting next task for host managed_node2
11762 1726853317.70454: ^ task is: None
11762 1726853317.70456: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11762 1726853317.70457: done queuing things up, now waiting for results queue to drain
11762 1726853317.70458: results queue empty
11762 1726853317.70458: checking for any_errors_fatal
11762 1726853317.70459: done checking for any_errors_fatal
11762 1726853317.70460: checking for max_fail_percentage
11762 1726853317.70461: done checking for max_fail_percentage
11762 1726853317.70461: checking to see if all hosts have failed and the running result is not ok
11762 1726853317.70462: done checking to see if all hosts have failed
11762 1726853317.70463: getting the next task for host managed_node2
11762 1726853317.70466: done getting next task for host managed_node2
11762 1726853317.70466: ^ task is: None
11762 1726853317.70467: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node2              : ok=148  changed=5    unreachable=0    failed=0    skipped=97   rescued=0    ignored=0

Friday 20 September 2024  13:28:37 -0400 (0:00:00.713)       0:01:08.135 ******
===============================================================================
** TEST check bond settings --------------------------------------------- 6.31s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
Install dnsmasq --------------------------------------------------------- 2.34s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.09s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.07s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.99s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
** TEST check bond settings --------------------------------------------- 1.90s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.86s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.78s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.67s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
Create test interfaces -------------------------------------------------- 1.58s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which packages are installed --- 1.28s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.18s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.07s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.07s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.00s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.95s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.92s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.90s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
11762 1726853317.70584: RUNNING CLEANUP